Fitting SSF models to the buffalo data

Dynamic step selection functions with temporal harmonics - dry season

Author

Scott Forrest

Published

February 11, 2025

Load packages

Loading packages
options(scipen=999)

library(tidyverse)
packages <- c("amt", "lubridate", "survival", "terra", "tictoc", 
              "beepr", "ggpubr", "MASS", "patchwork", "scales")
walk(packages, require, character.only = TRUE)

Importing buffalo data

Import the buffalo data with random steps and extracted covariates that we created for the paper Forrest et al. (2024), in the script Ecography_DynamicSSF_1_Step_generation. This repo can be found at: swforrest/dynamic_SSF_sims.

Here we create the sine and cosine terms that we interact with each of the covariates to fit temporally varying parameters.

Importing data
buffalo_data_all <- read_csv("data/buffalo_parametric_popn_covs_GvM_10rs_2024-09-04.csv")
Rows: 1165406 Columns: 22
── Column specification ────────────────────────────────────────────────────────
Delimiter: ","
dbl  (18): id, burst_, x1_, x2_, y1_, y2_, sl_, ta_, dt_, hour_t2, step_id_,...
lgl   (1): case_
dttm  (3): t1_, t2_, t2_rounded

ℹ Use `spec()` to retrieve the full column specification for this data.
ℹ Specify the column types or set `show_col_types = FALSE` to quiet this message.
Importing data
buffalo_data_all <- buffalo_data_all %>%
  mutate(t1_ = lubridate::with_tz(t1_, tzone = "Australia/Darwin"),
         t2_ = lubridate::with_tz(t2_, tzone = "Australia/Darwin"))

buffalo_data_all <- buffalo_data_all %>%
  mutate(id_num = as.numeric(factor(id)),
         step_id = step_id_,
         x1 = x1_, x2 = x2_,
         y1 = y1_, y2 = y2_,
         t1 = t1_,
         t1_rounded = round_date(t1_, "hour"),
         hour_t1 = hour(t1_rounded),
         t2 = t2_,
         t2_rounded = round_date(t2_, "hour"),
         hour_t2 = hour(t2_rounded),
         hour = hour_t2,
         yday = yday(t1_),
         year = year(t1_),
         month = month(t1_),
         sl = sl_,
         log_sl = log(sl_),
         ta = ta_,
         cos_ta = cos(ta_),
         # scale canopy cover from 0 to 1
         canopy_01 = canopy_cover/100,
         # here we create the harmonic terms for the hour of the day
         # for seasonal effects, change hour to yday (which is tau in the manuscript), 
         # and 24 to 365 (which is T)
         hour_s1 = sin(2*pi*hour/24),
         hour_s2 = sin(4*pi*hour/24),
         hour_s3 = sin(6*pi*hour/24),
         hour_c1 = cos(2*pi*hour/24),
         hour_c2 = cos(4*pi*hour/24),
         hour_c3 = cos(6*pi*hour/24))

# to select a single year of data
# buffalo_data_all <- buffalo_data_all %>% filter(t1 < "2019-07-25 09:32:42 ACST")

buffalo_ids <- unique(buffalo_data_all$id)

# Timeline of buffalo data
buffalo_data_all %>% ggplot(aes(x = t1, y = factor(id), colour = factor(id))) +
  geom_point(alpha = 0.1) +
  scale_y_discrete("Buffalo ID") +
  scale_x_datetime("Date") +
  scale_colour_viridis_d() +
  theme_bw() +
  theme(legend.position = "none")

Fitting the models

Creating a data matrix

First we create a data matrix to provide to the model, and then we scale and centre each column of the full data matrix. This means that all variables are scaled and centred after the data have been split into wet and dry season subsets, and after creating the quadratic and harmonic terms (when using them).

We should only include covariates in the data matrix that will be used in the model formula.
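As a minimal sketch of what `scale()` stores (using hypothetical data rather than the buffalo covariates), the centring and scaling values are kept as attributes on the returned matrix, and a coefficient fitted to the scaled data can be returned to the natural scale by dividing by the column's standard deviation:

```r
# hypothetical data frame standing in for the covariate matrix
x <- data.frame(a = c(1, 2, 3), b = c(10, 20, 30))
x_scaled <- scale(x)

# scale() stores the values used for centring and scaling as attributes
attr(x_scaled, "scaled:center")  # column means: a = 2, b = 20
attr(x_scaled, "scaled:scale")   # column standard deviations: a = 1, b = 10

# a coefficient fitted on the scaled data is returned to the natural
# scale by dividing by the standard deviation of that column
beta_scaled <- 0.5  # hypothetical fitted coefficient for 'a'
beta_natural <- beta_scaled / attr(x_scaled, "scaled:scale")["a"]
```

This is the same recovery we perform later with the `scaling_attributes_*` data frames.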

Models

  • 0p = 0 pairs of harmonics
  • 1p = 1 pair of harmonics
  • 2p = 2 pairs of harmonics
  • 3p = 3 pairs of harmonics

For the dynamic models, we start to add the harmonic terms. As we have already created the harmonic terms for the hour of the day (s1, c1, s2, etc.), we just interact (multiply) these with each of the covariates, including the quadratic terms, prior to model fitting. We store the scaling and centring values so we can reconstruct the natural-scale coefficients.
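The `transmute()` blocks below write out each covariate-by-harmonic interaction explicitly. The same pattern could be generated programmatically with `dplyr::across()`; a sketch with hypothetical column names (not part of the original script):

```r
library(dplyr)

# hypothetical data with two covariates and one pair of harmonics
dat <- tibble(ndvi = runif(5), slope = runif(5), hour = 0:4) %>%
  mutate(hour_s1 = sin(2 * pi * hour / 24),
         hour_c1 = cos(2 * pi * hour / 24))

# multiply each covariate by each harmonic term, naming the new
# columns covariate_s1, covariate_c1, etc.
dat_harm <- dat %>%
  mutate(across(c(ndvi, slope),
                list(s1 = ~ .x * hour_s1,
                     c1 = ~ .x * hour_c1),
                .names = "{.col}_{.fn}"))
```

The explicit blocks below are equivalent but make every interaction term visible by name.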

To provide some intuition about harmonic regression, we created a walkthrough script for Forrest et al. (2024), Ecography_DynamicSSF_Walkthrough_Harmonics_and_selection_surfaces, which can be found at swforrest/dynamic_SSF_sims. It introduces harmonics and how they can be used to model temporal variation in the data, and will help in understanding the model outputs and how we construct the temporally varying coefficients in this script.
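The key idea is that the temporally varying coefficient for a covariate is reconstructed from the fitted weights as β(t) = β₀ + β_s1 sin(2πt/24) + β_c1 cos(2πt/24), extending with further sine/cosine pairs as needed. A sketch with hypothetical coefficient values:

```r
# hypothetical fitted weights for a single covariate (1 pair of harmonics)
beta_0  <- -0.15   # the 'linear' term
beta_s1 <-  0.49   # sine interaction weight
beta_c1 <- -0.45   # cosine interaction weight

# reconstruct the coefficient across the hours of the day
hour <- seq(0, 24, by = 0.1)
beta_t <- beta_0 +
  beta_s1 * sin(2 * pi * hour / 24) +
  beta_c1 * cos(2 * pi * hour / 24)

plot(hour, beta_t, type = "l",
     xlab = "Hour of day", ylab = "Coefficient value")
```

With 2 or 3 pairs, the extra terms (s2/c2, s3/c3) simply add higher-frequency wiggles to this curve.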

Selecting data

Code
months_wet <- c(1:4, 11:12)
buffalo_ids <- unique(buffalo_data_all$id)
focal_id <- 2005

# comment and uncomment the relevant lines to get either wet or dry season data
# buffalo_data <- buffalo_data_all %>% filter(id == focal_id & month %in% months_wet) # wet season
buffalo_data <- buffalo_data_all %>% filter(id == focal_id & !month %in% months_wet) # dry season

# all data
# buffalo_data <- buffalo_data_all %>% filter(id == focal_id)
Code
buffalo_data_matrix_unscaled <- buffalo_data %>% transmute(
  
  ndvi = ndvi_temporal,
  ndvi_sq = ndvi_temporal ^ 2,
  canopy = canopy_01,
  canopy_sq = canopy_01 ^ 2,
  slope = slope,
  herby = veg_herby,
  step_l = sl,
  log_step_l = log_sl,
  cos_turn_a = cos_ta)

buffalo_data_matrix_scaled <- scale(buffalo_data_matrix_unscaled)

# save the scaling values to recover the natural scale of the coefficients
# which is required for the simulations 
# (so then environmental variables don't need to be scaled)
mean_vals <- attr(buffalo_data_matrix_scaled, "scaled:center")
sd_vals <- attr(buffalo_data_matrix_scaled, "scaled:scale")
scaling_attributes_0p <- data.frame(variable = names(buffalo_data_matrix_unscaled), 
                                    mean = mean_vals, sd = sd_vals)

# add the id, step_id columns and presence/absence columns to 
# the scaled data matrix for model fitting
buffalo_data_scaled_0p <- data.frame(id = buffalo_data$id,  
                                     step_id = buffalo_data$step_id, 
                                     y = buffalo_data$y, 
                                     buffalo_data_matrix_scaled)
Code
buffalo_data_matrix_unscaled <- buffalo_data %>% transmute(
  
  # the 'linear' term
  ndvi = ndvi_temporal,
  # interact with the harmonic terms
  ndvi_s1 = ndvi_temporal * hour_s1,
  ndvi_c1 = ndvi_temporal * hour_c1,
  
  ndvi_sq = ndvi_temporal ^ 2,
  ndvi_sq_s1 = (ndvi_temporal ^ 2) * hour_s1,
  ndvi_sq_c1 = (ndvi_temporal ^ 2) * hour_c1,
  
  canopy = canopy_01,
  canopy_s1 = canopy_01 * hour_s1,
  canopy_c1 = canopy_01 * hour_c1,
  
  canopy_sq = canopy_01 ^ 2,
  canopy_sq_s1 = (canopy_01 ^ 2) * hour_s1,
  canopy_sq_c1 = (canopy_01 ^ 2) * hour_c1,
  
  slope = slope,
  slope_s1 = slope * hour_s1,
  slope_c1 = slope * hour_c1,
  
  herby = veg_herby,
  herby_s1 = veg_herby * hour_s1,
  herby_c1 = veg_herby * hour_c1,
  
  step_l = sl,
  step_l_s1 = sl * hour_s1,
  step_l_c1 = sl * hour_c1,

  log_step_l = log_sl,
  log_step_l_s1 = log_sl * hour_s1,
  log_step_l_c1 = log_sl * hour_c1,

  cos_turn_a = cos_ta,
  cos_turn_a_s1 = cos_ta * hour_s1,
  cos_turn_a_c1 = cos_ta * hour_c1)

buffalo_data_matrix_scaled <- scale(buffalo_data_matrix_unscaled)

mean_vals <- attr(buffalo_data_matrix_scaled, "scaled:center")
sd_vals <- attr(buffalo_data_matrix_scaled, "scaled:scale")
scaling_attributes_1p <- data.frame(variable = names(buffalo_data_matrix_unscaled), 
                                    mean = mean_vals, sd = sd_vals)

buffalo_data_scaled_1p <- data.frame(id = buffalo_data$id,  
                                     step_id = buffalo_data$step_id, 
                                     y = buffalo_data$y, 
                                     buffalo_data_matrix_scaled)
Code
buffalo_data_matrix_unscaled <- buffalo_data %>% transmute(
  
  ndvi = ndvi_temporal,
  ndvi_s1 = ndvi_temporal * hour_s1,
  ndvi_s2 = ndvi_temporal * hour_s2,
  ndvi_c1 = ndvi_temporal * hour_c1,
  ndvi_c2 = ndvi_temporal * hour_c2,
  
  ndvi_sq = ndvi_temporal ^ 2,
  ndvi_sq_s1 = (ndvi_temporal ^ 2) * hour_s1,
  ndvi_sq_s2 = (ndvi_temporal ^ 2) * hour_s2,
  ndvi_sq_c1 = (ndvi_temporal ^ 2) * hour_c1,
  ndvi_sq_c2 = (ndvi_temporal ^ 2) * hour_c2,
  
  canopy = canopy_01,
  canopy_s1 = canopy_01 * hour_s1,
  canopy_s2 = canopy_01 * hour_s2,
  canopy_c1 = canopy_01 * hour_c1,
  canopy_c2 = canopy_01 * hour_c2,
  
  canopy_sq = canopy_01 ^ 2,
  canopy_sq_s1 = (canopy_01 ^ 2) * hour_s1,
  canopy_sq_s2 = (canopy_01 ^ 2) * hour_s2,
  canopy_sq_c1 = (canopy_01 ^ 2) * hour_c1,
  canopy_sq_c2 = (canopy_01 ^ 2) * hour_c2,
  
  slope = slope,
  slope_s1 = slope * hour_s1,
  slope_s2 = slope * hour_s2,
  slope_c1 = slope * hour_c1,
  slope_c2 = slope * hour_c2,
  
  herby = veg_herby,
  herby_s1 = veg_herby * hour_s1,
  herby_s2 = veg_herby * hour_s2,
  herby_c1 = veg_herby * hour_c1,
  herby_c2 = veg_herby * hour_c2,
  
  step_l = sl,
  step_l_s1 = sl * hour_s1,
  step_l_s2 = sl * hour_s2,
  step_l_c1 = sl * hour_c1,
  step_l_c2 = sl * hour_c2,

  log_step_l = log_sl,
  log_step_l_s1 = log_sl * hour_s1,
  log_step_l_s2 = log_sl * hour_s2,
  log_step_l_c1 = log_sl * hour_c1,
  log_step_l_c2 = log_sl * hour_c2,

  cos_turn_a = cos_ta,
  cos_turn_a_s1 = cos_ta * hour_s1,
  cos_turn_a_s2 = cos_ta * hour_s2,
  cos_turn_a_c1 = cos_ta * hour_c1,
  cos_turn_a_c2 = cos_ta * hour_c2)

buffalo_data_matrix_scaled <- scale(buffalo_data_matrix_unscaled)

mean_vals <- attr(buffalo_data_matrix_scaled, "scaled:center")
sd_vals <- attr(buffalo_data_matrix_scaled, "scaled:scale")
scaling_attributes_2p <- data.frame(variable = names(buffalo_data_matrix_unscaled), 
                                    mean = mean_vals, sd = sd_vals)

buffalo_data_scaled_2p <- data.frame(id = buffalo_data$id,  
                                     step_id = buffalo_data$step_id, 
                                     y = buffalo_data$y, 
                                     buffalo_data_matrix_scaled)
Code
buffalo_data_matrix_unscaled <- buffalo_data %>% transmute(
  
  ndvi = ndvi_temporal,
  ndvi_s1 = ndvi_temporal * hour_s1,
  ndvi_s2 = ndvi_temporal * hour_s2,
  ndvi_s3 = ndvi_temporal * hour_s3,
  ndvi_c1 = ndvi_temporal * hour_c1,
  ndvi_c2 = ndvi_temporal * hour_c2,
  ndvi_c3 = ndvi_temporal * hour_c3,   
  
  ndvi_sq = ndvi_temporal ^ 2,
  ndvi_sq_s1 = (ndvi_temporal ^ 2) * hour_s1,
  ndvi_sq_s2 = (ndvi_temporal ^ 2) * hour_s2,
  ndvi_sq_s3 = (ndvi_temporal ^ 2) * hour_s3,
  ndvi_sq_c1 = (ndvi_temporal ^ 2) * hour_c1,
  ndvi_sq_c2 = (ndvi_temporal ^ 2) * hour_c2,
  ndvi_sq_c3 = (ndvi_temporal ^ 2) * hour_c3,   
  
  canopy = canopy_01,
  canopy_s1 = canopy_01 * hour_s1,
  canopy_s2 = canopy_01 * hour_s2,
  canopy_s3 = canopy_01 * hour_s3,
  canopy_c1 = canopy_01 * hour_c1,
  canopy_c2 = canopy_01 * hour_c2,
  canopy_c3 = canopy_01 * hour_c3,   
  
  canopy_sq = canopy_01 ^ 2,
  canopy_sq_s1 = (canopy_01 ^ 2) * hour_s1,
  canopy_sq_s2 = (canopy_01 ^ 2) * hour_s2,
  canopy_sq_s3 = (canopy_01 ^ 2) * hour_s3,
  canopy_sq_c1 = (canopy_01 ^ 2) * hour_c1,
  canopy_sq_c2 = (canopy_01 ^ 2) * hour_c2,
  canopy_sq_c3 = (canopy_01 ^ 2) * hour_c3,   
  
  slope = slope,
  slope_s1 = slope * hour_s1,
  slope_s2 = slope * hour_s2,
  slope_s3 = slope * hour_s3,
  slope_c1 = slope * hour_c1,
  slope_c2 = slope * hour_c2,
  slope_c3 = slope * hour_c3,   
  
  herby = veg_herby,
  herby_s1 = veg_herby * hour_s1,
  herby_s2 = veg_herby * hour_s2,
  herby_s3 = veg_herby * hour_s3,
  herby_c1 = veg_herby * hour_c1,
  herby_c2 = veg_herby * hour_c2,
  herby_c3 = veg_herby * hour_c3,     
  
  step_l = sl,
  step_l_s1 = sl * hour_s1,
  step_l_s2 = sl * hour_s2,
  step_l_s3 = sl * hour_s3,
  step_l_c1 = sl * hour_c1,
  step_l_c2 = sl * hour_c2,
  step_l_c3 = sl * hour_c3,

  log_step_l = log_sl,
  log_step_l_s1 = log_sl * hour_s1,
  log_step_l_s2 = log_sl * hour_s2,
  log_step_l_s3 = log_sl * hour_s3,
  log_step_l_c1 = log_sl * hour_c1,
  log_step_l_c2 = log_sl * hour_c2,
  log_step_l_c3 = log_sl * hour_c3,

  cos_turn_a = cos_ta,
  cos_turn_a_s1 = cos_ta * hour_s1,
  cos_turn_a_s2 = cos_ta * hour_s2,
  cos_turn_a_s3 = cos_ta * hour_s3,
  cos_turn_a_c1 = cos_ta * hour_c1,
  cos_turn_a_c2 = cos_ta * hour_c2,
  cos_turn_a_c3 = cos_ta * hour_c3)

buffalo_data_matrix_scaled <- scale(buffalo_data_matrix_unscaled)

mean_vals <- attr(buffalo_data_matrix_scaled, "scaled:center")
sd_vals <- attr(buffalo_data_matrix_scaled, "scaled:scale")
scaling_attributes_3p <- data.frame(variable = names(buffalo_data_matrix_unscaled), 
                                    mean = mean_vals, sd = sd_vals)

buffalo_data_scaled_3p <- data.frame(id = buffalo_data$id,  
                                     step_id = buffalo_data$step_id, 
                                     y = buffalo_data$y, 
                                     buffalo_data_matrix_scaled)

Model formula

As we have already precomputed and scaled the covariates, quadratic terms, and interactions with the harmonics, we simply include each term as a linear predictor in the model formula.
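Because every term is just a column of the scaled data frame, the long formulas below could also be assembled programmatically from the column names with `reformulate()`; a sketch using a hypothetical data frame with the same layout (the explicit formulas below are what we actually use):

```r
# hypothetical scaled data frame with the same layout as buffalo_data_scaled_1p
scaled_df <- data.frame(id = 1, step_id = 1, y = 1,
                        ndvi = 0, ndvi_s1 = 0, ndvi_c1 = 0)

# every column except id, step_id and the response is a covariate
covariate_names <- setdiff(names(scaled_df), c("id", "step_id", "y"))

# build the conditional logistic regression formula with a stratum per step
formula_auto <- reformulate(c(covariate_names, "strata(step_id)"),
                            response = "y")
```

This guards against typos when the number of harmonic terms grows.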

Code
formula_0p <- y ~ 
  
  ndvi +
  ndvi_sq +
  canopy +
  canopy_sq +
  slope +
  herby +
  step_l +
  log_step_l +
  cos_turn_a +
  
  strata(step_id)
Code
formula_1p <- y ~ 
  
  ndvi +
  ndvi_s1 +
  ndvi_c1 +
  
  ndvi_sq +
  ndvi_sq_s1 +
  ndvi_sq_c1 +
  
  canopy +
  canopy_s1 +
  canopy_c1 +
  
  canopy_sq +
  canopy_sq_s1 +
  canopy_sq_c1 +
  
  slope +
  slope_s1 +
  slope_c1 +
  
  herby +
  herby_s1 +
  herby_c1 +
  
  step_l +
  step_l_s1 +
  step_l_c1 +
  
  log_step_l +
  log_step_l_s1 +
  log_step_l_c1 +

  cos_turn_a +
  cos_turn_a_s1 +
  cos_turn_a_c1 +
  
  strata(step_id) 
Code
formula_2p <- y ~ 
  
  ndvi +
  ndvi_s1 +
  ndvi_s2 +
  ndvi_c1 +
  ndvi_c2 +
  
  ndvi_sq +
  ndvi_sq_s1 +
  ndvi_sq_s2 +
  ndvi_sq_c1 +
  ndvi_sq_c2 +
  
  canopy +
  canopy_s1 +
  canopy_s2 +
  canopy_c1 +
  canopy_c2 +
  
  canopy_sq +
  canopy_sq_s1 +
  canopy_sq_s2 +
  canopy_sq_c1 +
  canopy_sq_c2 +
  
  slope +
  slope_s1 +
  slope_s2 +
  slope_c1 +
  slope_c2 +
  
  herby +
  herby_s1 +
  herby_s2 +
  herby_c1 +
  herby_c2 +
  
  step_l +
  step_l_s1 +
  step_l_s2 +
  step_l_c1 +
  step_l_c2 +
  
  log_step_l +
  log_step_l_s1 +
  log_step_l_s2 +
  log_step_l_c1 +
  log_step_l_c2 +

  cos_turn_a +
  cos_turn_a_s1 +
  cos_turn_a_s2 +
  cos_turn_a_c1 +
  cos_turn_a_c2 +
  
  strata(step_id) 
Code
formula_3p <- y ~ 
  
  ndvi +
  ndvi_s1 +
  ndvi_s2 +
  ndvi_s3 +  
  ndvi_c1 +
  ndvi_c2 +
  ndvi_c3 +  
  
  ndvi_sq +
  ndvi_sq_s1 +
  ndvi_sq_s2 +
  ndvi_sq_s3 +  
  ndvi_sq_c1 +
  ndvi_sq_c2 +
  ndvi_sq_c3 +  
  
  canopy +
  canopy_s1 +
  canopy_s2 +
  canopy_s3 +  
  canopy_c1 +
  canopy_c2 +
  canopy_c3 +  
  
  canopy_sq +
  canopy_sq_s1 +
  canopy_sq_s2 +
  canopy_sq_s3 +  
  canopy_sq_c1 +
  canopy_sq_c2 +
  canopy_sq_c3 +  
  
  slope +
  slope_s1 +
  slope_s2 +
  slope_s3 +  
  slope_c1 +
  slope_c2 +
  slope_c3 +  
  
  herby +
  herby_s1 +
  herby_s2 +
  herby_s3 +  
  herby_c1 +
  herby_c2 +
  herby_c3 +   
  
  step_l +
  step_l_s1 +
  step_l_s2 +
  step_l_s3 +  
  step_l_c1 +
  step_l_c2 +
  step_l_c3 +  
  
  log_step_l +
  log_step_l_s1 +
  log_step_l_s2 +
  log_step_l_s3 +  
  log_step_l_c1 +
  log_step_l_c2 +
  log_step_l_c3 +  
  
  cos_turn_a +
  cos_turn_a_s1 +
  cos_turn_a_s2 +
  cos_turn_a_s3 +  
  cos_turn_a_c1 +
  cos_turn_a_c2 +
  cos_turn_a_c3 +
  
  strata(step_id)

Fit the model

As we have already fitted the models, we load them here; if the saved model file doesn't exist, the model fitting code will run instead. Be careful: if you change the model formula, you will need to delete or rename the saved model file before re-running, otherwise the previous model will simply be loaded.

We are fitting a single model to the focal individual.
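The fit-or-load pattern below is repeated for each of the four models; it could equally be wrapped in a small helper function. A sketch, assuming the same file naming convention (`fit_or_load` is not part of the original script):

```r
# sketch of a fit-or-load helper, mirroring the pattern used below
fit_or_load <- function(path, formula, data) {
  if (file.exists(path)) {
    message("Read existing model")
    readRDS(path)
  } else {
    model <- amt::fit_clogit(formula = formula, data = data)
    saveRDS(model, file = path)
    message("Fitted model")
    model
  }
}

# e.g.
# model_0p_harms <- fit_or_load(
#   paste0("ssf_coefficients/model_id", focal_id, "_0p_harms_dry.rds"),
#   formula_0p, buffalo_data_scaled_0p)
```

We keep the explicit blocks below so each model's code chunk is self-contained in the rendered document.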

Code
if(file.exists(paste0("ssf_coefficients/model_id", focal_id, "_0p_harms_dry.rds"))) {
  
  model_0p_harms <- readRDS(paste0("ssf_coefficients/model_id", focal_id, "_0p_harms_dry.rds"))
  print("Read existing model")
  
} else {
  
  tic()
  model_0p_harms <- fit_clogit(formula = formula_0p,
                               data = buffalo_data_scaled_0p)
  toc()
  
  # save model object
  saveRDS(model_0p_harms, file = paste0("ssf_coefficients/model_id", focal_id, "_0p_harms_dry.rds"))
  
  print("Fitted model")
  beep(sound = 2)

}
[1] "Read existing model"
Code
model_0p_harms
$model
Call:
survival::clogit(formula, data = data, ...)

               coef exp(coef) se(coef)      z             p
ndvi       -0.14980   0.86088  0.11297 -1.326       0.18485
ndvi_sq     0.27865   1.32135  0.11046  2.523       0.01165
canopy      0.15724   1.17028  0.08122  1.936       0.05287
canopy_sq  -0.34153   0.71068  0.08337 -4.096 0.00004196797
slope      -0.12180   0.88532  0.02321 -5.248 0.00000015358
herby      -0.12591   0.88170  0.02115 -5.953 0.00000000263
step_l     -0.12264   0.88459  0.02043 -6.003 0.00000000194
log_step_l  0.05489   1.05643  0.01843  2.979       0.00289
cos_turn_a -0.01038   0.98967  0.01391 -0.746       0.45549

Likelihood ratio test=201.3  on 9 df, p=< 0.00000000000000022
n= 65082, number of events= 5692 
   (1710 observations deleted due to missingness)

$sl_
NULL

$ta_
NULL

$more
NULL

attr(,"class")
[1] "fit_clogit" "list"      
Code
if(file.exists(paste0("ssf_coefficients/model_id", focal_id, "_1p_harms_dry.rds"))) {
  
  model_1p_harms <- readRDS(paste0("ssf_coefficients/model_id", focal_id, "_1p_harms_dry.rds"))
  print("Read existing model")
  
} else {

  tic()
  model_1p_harms <- fit_clogit(formula = formula_1p,
                                       data = buffalo_data_scaled_1p)
  toc()
  
  # save model object
  saveRDS(model_1p_harms, file = paste0("ssf_coefficients/model_id", focal_id, "_1p_harms_dry.rds"))
  
  print("Fitted model")
  beep(sound = 2)

}
[1] "Read existing model"
Code
model_1p_harms
$model
Call:
survival::clogit(formula, data = data, ...)

                    coef  exp(coef)   se(coef)       z                    p
ndvi          -0.0101419  0.9899094  0.1206239  -0.084             0.932994
ndvi_s1        0.4908711  1.6337388  0.3715837   1.321             0.186493
ndvi_c1       -4.4668428  0.0114835  0.3953803 -11.298 < 0.0000000000000002
ndvi_sq        0.1253231  1.1335147  0.1182051   1.060             0.289046
ndvi_sq_s1    -0.2226478  0.8003967  0.1992793  -1.117             0.263881
ndvi_sq_c1     2.4757350 11.8904435  0.2148066  11.525 < 0.0000000000000002
canopy         0.3244912  1.3833266  0.1058701   3.065             0.002177
canopy_s1     -0.4622047  0.6298934  0.2734603  -1.690             0.090988
canopy_c1      1.4695710  4.3473699  0.3187982   4.610    0.000004032087403
canopy_sq     -0.5245070  0.5918471  0.1054087  -4.976    0.000000649329204
canopy_sq_s1   0.2844858  1.3290785  0.1718349   1.656             0.097808
canopy_sq_c1  -1.1510663  0.3162993  0.1974561  -5.829    0.000000005559997
slope         -0.1310712  0.8771553  0.0248421  -5.276    0.000000131906676
slope_s1      -0.1367048  0.8722277  0.0359432  -3.803             0.000143
slope_c1       0.0946943  1.0993228  0.0388848   2.435             0.014881
herby         -0.0988075  0.9059171  0.0231510  -4.268    0.000019726332577
herby_s1       0.0235596  1.0238393  0.0487167   0.484             0.628667
herby_c1       0.2579470  1.2942703  0.0516724   4.992    0.000000597667434
step_l        -0.2124163  0.8086280  0.0229516  -9.255 < 0.0000000000000002
step_l_s1      0.0584180  1.0601580  0.0259511   2.251             0.024381
step_l_c1      0.0003032  1.0003033  0.0269674   0.011             0.991029
log_step_l     0.1975952  1.2184690  0.0220287   8.970 < 0.0000000000000002
log_step_l_s1 -0.2827514  0.7537071  0.0384039  -7.363    0.000000000000180
log_step_l_c1 -0.6017845  0.5478332  0.0399545 -15.062 < 0.0000000000000002
cos_turn_a    -0.0032234  0.9967817  0.0142967  -0.225             0.821615
cos_turn_a_s1 -0.1031140  0.9020241  0.0142727  -7.225    0.000000000000503
cos_turn_a_c1 -0.1663023  0.8467902  0.0144650 -11.497 < 0.0000000000000002

Likelihood ratio test=1182  on 27 df, p=< 0.00000000000000022
n= 65082, number of events= 5692 
   (1710 observations deleted due to missingness)

$sl_
NULL

$ta_
NULL

$more
NULL

attr(,"class")
[1] "fit_clogit" "list"      
Code
if(file.exists(paste0("ssf_coefficients/model_id", focal_id, "_2p_harms_dry.rds"))) {
  
  model_2p_harms <- readRDS(paste0("ssf_coefficients/model_id", focal_id, "_2p_harms_dry.rds"))
  print("Read existing model")
  
} else {

  tic()
  model_2p_harms <- fit_clogit(formula = formula_2p,
                               data = buffalo_data_scaled_2p)
  toc()
  
  # save model object
  saveRDS(model_2p_harms, file = paste0("ssf_coefficients/model_id", focal_id, "_2p_harms_dry.rds"))
  
  print("Fitted model")
  beep(sound = 2)

}
[1] "Read existing model"
Code
model_2p_harms
$model
Call:
survival::clogit(formula, data = data, ...)

                    coef  exp(coef)   se(coef)       z                    p
ndvi          -0.0105652  0.9894904  0.1270414  -0.083             0.933721
ndvi_s1        0.2529427  1.2878094  0.3735203   0.677             0.498288
ndvi_s2       -0.0669366  0.9352545  0.3970369  -0.169             0.866119
ndvi_c1       -4.3850800  0.0124619  0.4303893 -10.189 < 0.0000000000000002
ndvi_c2        0.7367382  2.0891102  0.4013043   1.836             0.066378
ndvi_sq        0.1289442  1.1376266  0.1241104   1.039             0.298829
ndvi_sq_s1    -0.1044400  0.9008289  0.2005602  -0.521             0.602547
ndvi_sq_s2     0.0414484  1.0423194  0.2150427   0.193             0.847159
ndvi_sq_c1     2.4382075 11.4524935  0.2330174  10.464 < 0.0000000000000002
ndvi_sq_c2    -0.2188485  0.8034434  0.2169097  -1.009             0.313004
canopy         0.3923837  1.4805057  0.1213583   3.233             0.001224
canopy_s1     -0.3372183  0.7137530  0.3200240  -1.054             0.292007
canopy_s2      0.0568951  1.0585448  0.3048123   0.187             0.851930
canopy_c1      1.8083945  6.1006452  0.4137692   4.371    0.000012394011529
canopy_c2      0.4700311  1.6000439  0.3169358   1.483             0.138062
canopy_sq     -0.5794630  0.5601991  0.1194847  -4.850    0.000001236592905
canopy_sq_s1   0.2184749  1.2441777  0.1980022   1.103             0.269855
canopy_sq_s2   0.0913285  1.0956288  0.1893414   0.482             0.629559
canopy_sq_c1  -1.3075321  0.2704868  0.2514654  -5.200    0.000000199662891
canopy_sq_c2  -0.1476704  0.8627155  0.1966837  -0.751             0.452772
slope         -0.1481119  0.8623346  0.0274139  -5.403    0.000000065606762
slope_s1      -0.1146156  0.8917089  0.0355056  -3.228             0.001246
slope_s2       0.0347404  1.0353509  0.0395861   0.878             0.380166
slope_c1       0.0764744  1.0794746  0.0439889   1.738             0.082124
slope_c2      -0.0869779  0.9166974  0.0401580  -2.166             0.030319
herby         -0.0977996  0.9068306  0.0241066  -4.057    0.000049713104414
herby_s1       0.0157429  1.0158675  0.0515717   0.305             0.760166
herby_s2       0.0002868  1.0002868  0.0510424   0.006             0.995517
herby_c1       0.1969839  1.2177244  0.0560673   3.513             0.000443
herby_c2      -0.0702094  0.9321986  0.0518900  -1.353             0.176042
step_l        -0.4526089  0.6359668  0.0311392 -14.535 < 0.0000000000000002
step_l_s1     -0.0553466  0.9461571  0.0256130  -2.161             0.030704
step_l_s2     -0.3384349  0.7128852  0.0299257 -11.309 < 0.0000000000000002
step_l_c1     -0.1758616  0.8387341  0.0374920  -4.691    0.000002723473773
step_l_c2     -0.3561954  0.7003358  0.0313576 -11.359 < 0.0000000000000002
log_step_l     0.2615324  1.2989190  0.0231012  11.321 < 0.0000000000000002
log_step_l_s1 -0.2391907  0.7872647  0.0434073  -5.510    0.000000035807221
log_step_l_s2 -0.0774179  0.9255030  0.0409704  -1.890             0.058811
log_step_l_c1 -0.5169396  0.5963428  0.0430368 -12.012 < 0.0000000000000002
log_step_l_c2 -0.0678024  0.9344451  0.0410499  -1.652             0.098594
cos_turn_a    -0.0005184  0.9994818  0.0145390  -0.036             0.971559
cos_turn_a_s1 -0.1076412  0.8979497  0.0145766  -7.385    0.000000000000153
cos_turn_a_s2 -0.1385594  0.8706116  0.0145725  -9.508 < 0.0000000000000002
cos_turn_a_c1 -0.1575777  0.8542105  0.0146656 -10.745 < 0.0000000000000002
cos_turn_a_c2 -0.0838055  0.9196101  0.0146155  -5.734    0.000000009808054

Likelihood ratio test=1882  on 45 df, p=< 0.00000000000000022
n= 65082, number of events= 5692 
   (1710 observations deleted due to missingness)

$sl_
NULL

$ta_
NULL

$more
NULL

attr(,"class")
[1] "fit_clogit" "list"      
Code
if(file.exists(paste0("ssf_coefficients/model_id", focal_id, "_3p_harms_dry.rds"))) {
  
  model_3p_harms <- readRDS(paste0("ssf_coefficients/model_id", focal_id, "_3p_harms_dry.rds"))
  print("Read existing model")
  
} else {

  tic()
  model_3p_harms <- fit_clogit(formula = formula_3p,
                                     data = buffalo_data_scaled_3p)
  toc()
  
  # save model object
  saveRDS(model_3p_harms, file = paste0("ssf_coefficients/model_id", focal_id, "_3p_harms_dry.rds"))
  
  print("Fitted model")
  beep(sound = 2)

}
[1] "Read existing model"
Code
model_3p_harms
$model
Call:
survival::clogit(formula, data = data, ...)

                    coef  exp(coef)   se(coef)       z                    p
ndvi           0.0706066  1.0731590  0.1277746   0.553             0.580546
ndvi_s1        0.2245451  1.2517532  0.3887548   0.578             0.563534
ndvi_s2       -0.0194743  0.9807141  0.4057951  -0.048             0.961724
ndvi_s3       -0.4608978  0.6307171  0.3997941  -1.153             0.248977
ndvi_c1       -4.3895782  0.0124060  0.4283404 -10.248 < 0.0000000000000002
ndvi_c2        0.3976964  1.4883922  0.4073673   0.976             0.328936
ndvi_c3       -0.1096700  0.8961298  0.4068839  -0.270             0.787517
ndvi_sq        0.0549287  1.0564653  0.1250750   0.439             0.660541
ndvi_sq_s1    -0.0861213  0.9174829  0.2092987  -0.411             0.680724
ndvi_sq_s2     0.0160391  1.0161684  0.2194906   0.073             0.941747
ndvi_sq_s3     0.2579377  1.2942581  0.2173907   1.187             0.235418
ndvi_sq_c1     2.4283014 11.3396042  0.2318705  10.473 < 0.0000000000000002
ndvi_sq_c2    -0.0671742  0.9350323  0.2201701  -0.305             0.760289
ndvi_sq_c3    -0.1488389  0.8617080  0.2201537  -0.676             0.498998
canopy         0.3484793  1.4169112  0.1195513   2.915             0.003558
canopy_s1     -0.0816157  0.9216261  0.3580034  -0.228             0.819666
canopy_s2      0.3298074  1.3907002  0.3732663   0.884             0.376928
canopy_s3      0.4476350  1.5646075  0.3106624   1.441             0.149612
canopy_c1      1.5732214  4.8221573  0.4011492   3.922   0.0000878948645124
canopy_c2      0.1330122  1.1422639  0.3693462   0.360             0.718751
canopy_c3     -0.5230254  0.5927246  0.3195681  -1.637             0.101701
canopy_sq     -0.5483508  0.5779021  0.1178490  -4.653   0.0000032715062221
canopy_sq_s1   0.0559853  1.0575822  0.2187516   0.256             0.798004
canopy_sq_s2  -0.0614058  0.9404416  0.2278020  -0.270             0.787501
canopy_sq_s3  -0.2526854  0.7767122  0.1930056  -1.309             0.190462
canopy_sq_c1  -1.1755242  0.3086571  0.2439113  -4.819   0.0000014393650984
canopy_sq_c2   0.0592535  1.0610442  0.2258879   0.262             0.793080
canopy_sq_c3   0.3889201  1.4753867  0.1976682   1.968             0.049121
slope         -0.1533516  0.8578280  0.0272870  -5.620   0.0000000191020063
slope_s1      -0.1037130  0.9014840  0.0378366  -2.741             0.006124
slope_s2       0.0328810  1.0334275  0.0403754   0.814             0.415427
slope_s3       0.0212681  1.0214958  0.0395627   0.538             0.590868
slope_c1       0.0702020  1.0727248  0.0442970   1.585             0.113012
slope_c2      -0.0658964  0.9362278  0.0404277  -1.630             0.103105
slope_c3      -0.0483677  0.9527834  0.0408586  -1.184             0.236499
herby         -0.0901833  0.9137637  0.0241294  -3.737             0.000186
herby_s1       0.0027006  1.0027043  0.0528976   0.051             0.959282
herby_s2      -0.0046719  0.9953390  0.0539226  -0.087             0.930957
herby_s3      -0.0807781  0.9223983  0.0518024  -1.559             0.118914
herby_c1       0.2100050  1.2336842  0.0556708   3.772             0.000162
herby_c2      -0.0465425  0.9545240  0.0544014  -0.856             0.392253
herby_c3      -0.0182664  0.9818995  0.0520511  -0.351             0.725640
step_l        -0.4738944  0.6225730  0.0307875 -15.392 < 0.0000000000000002
step_l_s1     -0.0053407  0.9946735  0.0319965  -0.167             0.867436
step_l_s2     -0.2740341  0.7603062  0.0323389  -8.474 < 0.0000000000000002
step_l_s3     -0.0112702  0.9887931  0.0316372  -0.356             0.721667
step_l_c1     -0.0423870  0.9584988  0.0402960  -1.052             0.292850
step_l_c2     -0.2555728  0.7744727  0.0332400  -7.689   0.0000000000000149
step_l_c3     -0.0193143  0.9808710  0.0315001  -0.613             0.539777
log_step_l     0.3819615  1.4651557  0.0261920  14.583 < 0.0000000000000002
log_step_l_s1 -0.2611291  0.7701815  0.0508703  -5.133   0.0000002848068786
log_step_l_s2 -0.1991106  0.8194592  0.0461490  -4.315   0.0000159953500496
log_step_l_s3  0.5267168  1.6933635  0.0434312  12.128 < 0.0000000000000002
log_step_l_c1 -0.6766963  0.5082935  0.0435923 -15.523 < 0.0000000000000002
log_step_l_c2 -0.3136382  0.7307834  0.0460771  -6.807   0.0000000000099782
log_step_l_c3  0.3861357  1.4712843  0.0430061   8.979 < 0.0000000000000002
cos_turn_a     0.0004812  1.0004813  0.0147369   0.033             0.973954
cos_turn_a_s1 -0.1017963  0.9032136  0.0149425  -6.813   0.0000000000095893
cos_turn_a_s2 -0.1369483  0.8720153  0.0147571  -9.280 < 0.0000000000000002
cos_turn_a_s3  0.1443249  1.1552594  0.0148551   9.716 < 0.0000000000000002
cos_turn_a_c1 -0.1648113  0.8480537  0.0147472 -11.176 < 0.0000000000000002
cos_turn_a_c2 -0.0915187  0.9125443  0.0148916  -6.146   0.0000000007962441
cos_turn_a_c3  0.0491386  1.0503659  0.0147343   3.335             0.000853

Likelihood ratio test=2347  on 63 df, p=< 0.00000000000000022
n= 65082, number of events= 5692 
   (1710 observations deleted due to missingness)

$sl_
NULL

$ta_
NULL

$more
NULL

attr(,"class")
[1] "fit_clogit" "list"      

Check the fitted model outputs

Create a dataframe of the coefficients with the scaling attributes that we saved when creating the data matrix. We can then return the coefficients to their natural scale by dividing by the scaling factor (standard deviation).

As we can see, we have a coefficient for each covariate by itself, and then one for each of the harmonic interactions. These are the ‘weights’ that we experimented with in the Ecography_DynamicSSF_Walkthrough_Harmonics_and_selection_surfaces walkthrough script at swforrest/dynamic_SSF_sims, and we reconstruct them in exactly the same way. We also have coefficients for the quadratic terms and their interactions with the harmonics, denoted with a _sq suffix (ndvi_sq, for instance). We will come back to these when we look at the selection surfaces.

Code
model_0p_harms
$model
Call:
survival::clogit(formula, data = data, ...)

               coef exp(coef) se(coef)      z             p
ndvi       -0.14980   0.86088  0.11297 -1.326       0.18485
ndvi_sq     0.27865   1.32135  0.11046  2.523       0.01165
canopy      0.15724   1.17028  0.08122  1.936       0.05287
canopy_sq  -0.34153   0.71068  0.08337 -4.096 0.00004196797
slope      -0.12180   0.88532  0.02321 -5.248 0.00000015358
herby      -0.12591   0.88170  0.02115 -5.953 0.00000000263
step_l     -0.12264   0.88459  0.02043 -6.003 0.00000000194
log_step_l  0.05489   1.05643  0.01843  2.979       0.00289
cos_turn_a -0.01038   0.98967  0.01391 -0.746       0.45549

Likelihood ratio test=201.3  on 9 df, p=< 0.00000000000000022
n= 65082, number of events= 5692 
   (1710 observations deleted due to missingness)

$sl_
NULL

$ta_
NULL

$more
NULL

attr(,"class")
[1] "fit_clogit" "list"      
Code
# these create massive outputs for the dynamic models so we've commented them out
# model_0p_harms$model$coefficients
# model_0p_harms$se
# model_0p_harms$vcov
# diag(model_0p_harms$D) # between cluster variance
# model_0p_harms$r.effect # individual estimates

# create a dataframe of the coefficients and their scaling attributes
coefs_clr_0p <- data.frame(coefs = names(model_0p_harms$model$coefficients), 
                           value = model_0p_harms$model$coefficients)
coefs_clr_0p$scale_sd <- scaling_attributes_0p$sd
coefs_clr_0p <- coefs_clr_0p %>% mutate(value_nat = value / scale_sd)
head(coefs_clr_0p)
Code
# creates a huge output due to the correlation matrix
# model_1p_harms

# model_1p_harms
# model_1p_harms$model$coefficients
# model_1p_harms$se
# model_1p_harms$vcov
# diag(model_1p_harms$D) # between cluster variance
# model_1p_harms$r.effect # individual estimates

coefs_clr_1p <- data.frame(coefs = names(model_1p_harms$model$coefficients), 
                           value = model_1p_harms$model$coefficients)
coefs_clr_1p$scale_sd <- scaling_attributes_1p$sd
coefs_clr_1p <- coefs_clr_1p %>% mutate(value_nat = value / scale_sd)
head(coefs_clr_1p)
Code
# creates a huge output due to the correlation matrix
# model_2p_harms

# model_2p_harms
# model_2p_harms$model$coefficients
# model_2p_harms$se
# model_2p_harms$vcov
# diag(model_2p_harms$D) # between cluster variance
# model_2p_harms$r.effect # individual estimates

# creating data frame of model coefficients
coefs_clr_2p <- data.frame(coefs = names(model_2p_harms$model$coefficients), 
                           value = model_2p_harms$model$coefficients)
coefs_clr_2p$scale_sd <- scaling_attributes_2p$sd
coefs_clr_2p <- coefs_clr_2p %>% mutate(value_nat = value / scale_sd)
head(coefs_clr_2p)
Code
# creates a huge output due to the correlation matrix
# model_3p_harms

# model_3p_harms$model$coefficients
# model_3p_harms$se
# model_3p_harms$vcov
# diag(model_3p_harms$D) # between cluster variance
# model_3p_harms$r.effect # individual estimates

# creating dataframe of coefficients
coefs_clr_3p <- data.frame(coefs = names(model_3p_harms$model$coefficients), 
                           value = model_3p_harms$model$coefficients)
coefs_clr_3p$scale_sd <- scaling_attributes_3p$sd
coefs_clr_3p <- coefs_clr_3p %>% mutate(value_nat = value / scale_sd)
head(coefs_clr_3p)

Reconstruct the temporally dynamic coefficients

First we reconstruct the hourly coefficients for the model with no harmonics. This step isn’t strictly necessary, as we already have the coefficients and have rescaled them in the dataframe created above. However, because we also fit harmonic models and recover their coefficients across time, we use the same approach here, so that we can plot the static and dynamic outputs of the models together. It also means that we can use the same simulation code (which indexes across the hour of the day) and just change the data frame of coefficients; for the static model the code will still index across the hours, but the coefficients won’t change.

We need a sequence of values that covers a full period (or whatever span we want to reconstruct the function over, which can be more or less than one period). The sequence can be arbitrarily finely spaced; the smaller the increment, the smoother the function will be for plotting. When simulating data from the temporally dynamic coefficients, we will subset to the increment that matches the data collection and model fitting (one hour in this case).

Essentially, the coefficients can be considered as weights of the harmonics, which combine into a single function.

Now we can reconstruct the harmonic function using the formula that we put into our model by interacting the harmonic terms with each of the covariates. For two pairs of harmonics (2p) and a single covariate, let’s say herbaceous vegetation (herby), this would be written as:

\[ f = \beta_{herby} + \beta_{herby\_s1} \sin\left(\frac{2\pi t}{24}\right) + \beta_{herby\_c1} \cos\left(\frac{2\pi t}{24}\right) + \beta_{herby\_s2} \sin\left(\frac{4\pi t}{24}\right) + \beta_{herby\_c2} \cos\left(\frac{4\pi t}{24}\right), \]

where we have 5 \(\beta_{herby}\) coefficients, one for the linear term, and one for each of the harmonic terms.

Here we use matrix multiplication to reconstruct the temporally dynamic coefficients. We provide some background in the Ecography_DynamicSSF_Walkthrough_Harmonics_and_selection_surfaces script.

First we create a matrix of the values of the harmonics, which is just the sine and cosine terms of each harmonic, and then we multiply this by the coefficients to get the function. When we use two pairs of harmonics we have 5 coefficients for each covariate (the linear term plus 2 sine and 2 cosine terms), so the matrix will have 5 columns.

For matrix multiplication, the number of columns in the first matrix must be equal to the number of rows in the second matrix. The result will then have the same number of rows as the first matrix and the same number of columns as the second matrix.

Or in other words, if we have a 24 x 5 matrix of harmonics and a 5 x 1 matrix of coefficients, we will get a 24 x 1 matrix of the function, which corresponds to our 24 hours of the day.
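As a toy sketch of these dimensions (made-up coefficient values, and hourly increments rather than the 0.1-hour sequence used below):

```r
# reconstructing one temporally dynamic coefficient by matrix multiplication
hour <- 0:23

# 24 x 5 matrix of harmonic terms: linear term plus two pairs of harmonics
H <- cbind(linear = 1,
           s1 = sin(2 * pi * hour / 24),
           c1 = cos(2 * pi * hour / 24),
           s2 = sin(4 * pi * hour / 24),
           c2 = cos(4 * pi * hour / 24))

# 5 x 1 matrix of (made-up) coefficients: the 'weights' of the harmonics
beta <- matrix(c(-0.1, 0.3, -0.2, 0.05, 0.1), ncol = 1)

# (24 x 5) %*% (5 x 1) = 24 x 1: one coefficient value per hour of the day
f <- H %*% beta
dim(f)  # [1] 24  1
```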

Code
# increments are arbitrary - finer results in smoother curves
# for the simulations we will subset to the step interval
hour <- seq(0,23.9,0.1) 

# create the dataframe of values of the harmonic terms over the period (here just the linear term)
hour_harmonics_df_0p <- data.frame("linear_term" = rep(1, length(hour)))

harmonics_scaled_df_0p <- data.frame(
  "hour" = hour,
  "ndvi" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("ndvi", coefs) & !grepl("sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "ndvi_2" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("ndvi_sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "canopy" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("canopy", coefs) & !grepl("sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "canopy_2" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("canopy_sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "slope" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("slope", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "herby" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("herby", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "sl" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("step_l", coefs) & !grepl("log", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "log_sl" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("log_step_l", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "cos_ta" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("cos", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))))

harmonics_scaled_long_0p <- pivot_longer(harmonics_scaled_df_0p, 
                                         cols = !1, 
                                         names_to = "coef")
Code
# create the dataframe of values of the harmonic terms over the period
hour_harmonics_df_1p <- data.frame("linear_term" = rep(1, length(hour)),
                                "hour_s1" = sin(2*pi*hour/24),
                                "hour_c1" = cos(2*pi*hour/24))

harmonics_scaled_df_1p <- data.frame(
  "hour" = hour,
  "ndvi" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("ndvi", coefs) & !grepl("sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "ndvi_2" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("ndvi_sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "canopy" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("canopy", coefs) & !grepl("sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "canopy_2" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("canopy_sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "slope" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("slope", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "herby" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("herby", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "sl" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("step_l", coefs) & !grepl("log", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "log_sl" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("log_step_l", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "cos_ta" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("cos", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))))

harmonics_scaled_long_1p <- pivot_longer(harmonics_scaled_df_1p, 
                                         cols = !1, 
                                         names_to = "coef")
Code
# create the dataframe of values of the harmonic terms over the period
hour_harmonics_df_2p <- data.frame("linear_term" = rep(1, length(hour)),
                                "hour_s1" = sin(2*pi*hour/24),
                                "hour_s2" = sin(4*pi*hour/24),
                                "hour_c1" = cos(2*pi*hour/24),
                                "hour_c2" = cos(4*pi*hour/24))

harmonics_scaled_df_2p <- data.frame(
  "hour" = hour,
  "ndvi" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("ndvi", coefs) & !grepl("sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "ndvi_2" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("ndvi_sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "canopy" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("canopy", coefs) & !grepl("sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "canopy_2" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("canopy_sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "slope" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("slope", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "herby" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("herby", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "sl" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("step_l", coefs) & !grepl("log", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "log_sl" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("log_step_l", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "cos_ta" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("cos", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))))

harmonics_scaled_long_2p <- pivot_longer(harmonics_scaled_df_2p, cols = !1, 
                                         names_to = "coef")
Code
# create the dataframe of values of the harmonic terms over the period
hour_harmonics_df_3p <- data.frame("linear_term" = rep(1, length(hour)),
                                "hour_s1" = sin(2*pi*hour/24),
                                "hour_s2" = sin(4*pi*hour/24),
                                "hour_s3" = sin(6*pi*hour/24),
                                "hour_c1" = cos(2*pi*hour/24),
                                "hour_c2" = cos(4*pi*hour/24),
                                "hour_c3" = cos(6*pi*hour/24))

harmonics_scaled_df_3p <- data.frame(
  "hour" = hour,
  "ndvi" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("ndvi", coefs) & !grepl("sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "ndvi_2" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("ndvi_sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "canopy" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("canopy", coefs) & !grepl("sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "canopy_2" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("canopy_sq", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "slope" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("slope", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "herby" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("herby", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "sl" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("step_l", coefs) & !grepl("log", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "log_sl" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("log_step_l", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "cos_ta" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("cos", coefs)) %>% 
      pull(value) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))))

harmonics_scaled_long_3p <- pivot_longer(harmonics_scaled_df_3p, cols = !1, 
                                         names_to = "coef")

Plot the results - scaled temporally dynamic coefficients

Here we show the temporally varying coefficients across the hours of the day (these are still on the scale of the standardised covariates).

Code
ggplot() +
    geom_path(data = harmonics_scaled_long_0p,
              aes(x = hour, y = value, colour = coef)) +
    geom_hline(yintercept = 0, linetype = "dashed") +
    scale_y_continuous(expression("Time-varying parameter values"~beta)) +
    scale_x_continuous("Hour") +
    scale_color_discrete("Estimate") +
    theme_classic() +
    theme(legend.position = "bottom")

Code
ggplot() +
    geom_path(data = harmonics_scaled_long_1p,
              aes(x = hour, y = value, colour = coef)) +
    geom_hline(yintercept = 0, linetype = "dashed") +
    scale_y_continuous(expression("Time-varying parameter values"~beta)) +
    scale_x_continuous("Hour") +
    scale_color_discrete("Estimate") +
    theme_classic() +
    theme(legend.position = "bottom")

Code
ggplot() +
    geom_path(data = harmonics_scaled_long_2p,
              aes(x = hour, y = value, colour = coef)) +
    geom_hline(yintercept = 0, linetype = "dashed") +
    scale_y_continuous(expression("Time-varying parameter values"~beta)) +
    scale_x_continuous("Hour") +
    scale_color_discrete("Estimate") +
    theme_classic() +
    theme(legend.position = "bottom")

Code
ggplot() +
    geom_path(data = harmonics_scaled_long_3p,
              aes(x = hour, y = value, colour = coef)) +
    geom_hline(yintercept = 0, linetype = "dashed") +
    scale_y_continuous(expression("Time-varying parameter values"~beta)) +
    scale_x_continuous("Hour") +
    scale_color_discrete("Estimate") +
    theme_classic() +
    theme(legend.position = "bottom")

Reconstructing the natural-scale temporally dynamic coefficients

As we scaled the covariate values prior to fitting the models, we want to rescale the coefficients to their natural scale. This is important for the simulations, as the environmental variables will not be scaled when we simulate steps.
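To see why dividing by the standard deviation is all that is needed, note that for a centred and scaled covariate the contribution to the linear predictor is

\[ \beta_{scaled} \cdot \frac{x - \bar{x}}{\sigma_x} = \frac{\beta_{scaled}}{\sigma_x} \cdot x - \frac{\beta_{scaled} \, \bar{x}}{\sigma_x}, \]

so \(\beta_{nat} = \beta_{scaled} / \sigma_x\). The constant term involving \(\bar{x}\) is the same for every step within a stratum, so it cancels in the conditional (clogit) likelihood.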

Code
harmonics_nat_df_0p <- data.frame(
  "hour" = hour,
  "ndvi" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("ndvi", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "ndvi_2" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("ndvi_sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "canopy" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("canopy", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "canopy_2" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("canopy_sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "slope" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("slope", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "herby" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("herby", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "sl" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("step_l", coefs) & !grepl("log", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "log_sl" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("log_step_l", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))),
  "cos_ta" = as.numeric(
    coefs_clr_0p %>% dplyr::filter(grepl("cos", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_0p))))
Code
harmonics_nat_df_1p <- data.frame(
  "hour" = hour,
  "ndvi" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("ndvi", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "ndvi_2" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("ndvi_sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "canopy" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("canopy", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "canopy_2" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("canopy_sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "slope" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("slope", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "herby" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("herby", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "sl" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("step_l", coefs) & !grepl("log", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "log_sl" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("log_step_l", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))),
  "cos_ta" = as.numeric(
    coefs_clr_1p %>% dplyr::filter(grepl("cos", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_1p))))
Code
harmonics_nat_df_2p <- data.frame(
  "hour" = hour,
  "ndvi" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("ndvi", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "ndvi_2" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("ndvi_sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "canopy" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("canopy", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "canopy_2" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("canopy_sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "slope" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("slope", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "herby" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("herby", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "sl" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("step_l", coefs) & !grepl("log", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "log_sl" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("log_step_l", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))),
  "cos_ta" = as.numeric(
    coefs_clr_2p %>% dplyr::filter(grepl("cos", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_2p))))
Code
harmonics_nat_df_3p <- data.frame(
  "hour" = hour,
  "ndvi" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("ndvi", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "ndvi_2" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("ndvi_sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "canopy" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("canopy", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "canopy_2" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("canopy_sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "slope" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("slope", coefs) & !grepl("sq", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "herby" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("herby", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "sl" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("step_l", coefs) & !grepl("log", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "log_sl" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("log_step_l", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))),
  "cos_ta" = as.numeric(
    coefs_clr_3p %>% dplyr::filter(grepl("cos", coefs)) %>% 
      pull(value_nat) %>% t() %*% t(as.matrix(hour_harmonics_df_3p))))

Update the Gamma and von Mises distributions

To update the Gamma and von Mises distributions from the tentative distributions (e.g. Fieberg et al. 2021, Appendix C), we apply the adjustment at each time point, using the natural-scale movement coefficients.
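Written out, with tentative Gamma shape \(k_0\) and scale \(\theta_0\), and tentative von Mises concentration \(\kappa_0\), the hourly updates are

\[ k(t) = k_0 + \beta_{\log(sl)}(t), \qquad \theta(t) = \frac{1}{\frac{1}{\theta_0} - \beta_{sl}(t)}, \qquad \kappa(t) = \kappa_0 + \beta_{\cos(ta)}(t), \]

where \(\beta_{\log(sl)}(t)\), \(\beta_{sl}(t)\) and \(\beta_{\cos(ta)}(t)\) are the natural-scale movement coefficients at hour \(t\).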

Code
# from the step generation script
tentative_shape <- 0.438167
tentative_scale <- 534.3507
tentative_kappa <- 0.1848126

hour_coefs_nat_df_0p <- harmonics_nat_df_0p %>% 
  mutate(shape = tentative_shape + log_sl,
         scale = 1/((1/tentative_scale) - sl),
         kappa = tentative_kappa + cos_ta)

# save the coefficients to use in the simulations
write_csv(hour_coefs_nat_df_0p,
          paste0("ssf_coefficients/id", focal_id, "_0pDaily_coefs_dry_", Sys.Date(), ".csv"))

# turning into a long data frame
hour_coefs_nat_long_0p <- pivot_longer(hour_coefs_nat_df_0p, 
                                       cols = !1, 
                                       names_to = "coef")
Code
hour_coefs_nat_df_1p <- harmonics_nat_df_1p %>% 
  mutate(shape = tentative_shape + log_sl,
         scale = 1/((1/tentative_scale) - sl),
         kappa = tentative_kappa + cos_ta)

# save the coefficients to use in the simulations
write_csv(hour_coefs_nat_df_1p,
          paste0("ssf_coefficients/id", focal_id, "_1pDaily_coefs_dry_",Sys.Date(), ".csv"))

# turning into a long data frame
hour_coefs_nat_long_1p <- pivot_longer(hour_coefs_nat_df_1p, 
                                       cols = !1, names_to = "coef")
Code
hour_coefs_nat_df_2p <- harmonics_nat_df_2p %>% 
  mutate(shape = tentative_shape + log_sl,
         scale = 1/((1/tentative_scale) - sl),
         kappa = tentative_kappa + cos_ta)

# save the coefficients to use in the simulations
write_csv(hour_coefs_nat_df_2p,
          paste0("ssf_coefficients/id", focal_id, "_2pDaily_coefs_dry_",Sys.Date(), ".csv"))

# turning into a long data frame
hour_coefs_nat_long_2p <- pivot_longer(hour_coefs_nat_df_2p, cols = !1, 
                                       names_to = "coef")
Code
hour_coefs_nat_df_3p <- harmonics_nat_df_3p %>% 
  mutate(shape = tentative_shape + log_sl,
         scale = 1/((1/tentative_scale) - sl),
         kappa = tentative_kappa + cos_ta)

# save the coefficients to use in the simulations
write_csv(hour_coefs_nat_df_3p,
          paste0("ssf_coefficients/id", focal_id, "_3pDaily_coefs_dry_", Sys.Date(), ".csv"))

# turning into a long data frame
hour_coefs_nat_long_3p <- pivot_longer(hour_coefs_nat_df_3p, cols = !1, 
                                       names_to = "coef")

Plot the natural-scale temporally dynamic coefficients

Now that the coefficients are on their natural scales, their magnitudes will differ depending on the scale of each covariate.

Plot just the habitat selection coefficients.

Code
ggplot() +
  geom_path(data = hour_coefs_nat_long_0p %>%
              filter(!coef %in% c("shape", "scale", "kappa")),
            aes(x = hour, y = value, colour = coef)) +
  geom_hline(yintercept = 0, linetype = "dashed") +
  scale_y_continuous(expression("Time-varying parameter values"~beta)) +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_color_discrete("Estimate") +
  theme_classic() +
  theme(legend.position = "bottom")

Code
ggplot() +
  geom_path(data = hour_coefs_nat_long_1p %>%
              filter(!coef %in% c("shape", "scale", "kappa")),
            aes(x = hour, y = value, colour = coef)) +
  geom_hline(yintercept = 0, linetype = "dashed") +
  scale_y_continuous(expression("Time-varying parameter values"~beta)) +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_color_discrete("Estimate") +
  theme_classic() +
  theme(legend.position = "bottom")

Code
ggplot() +
  geom_path(data = hour_coefs_nat_long_2p %>%
              filter(!coef %in% c("shape", "scale", "kappa")),
            aes(x = hour, y = value, colour = coef)) +
  geom_hline(yintercept = 0, linetype = "dashed") +
  scale_y_continuous(expression("Time-varying parameter values"~beta)) +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_color_discrete("Estimate") +
  theme_classic() +
  theme(legend.position = "bottom")

Code
ggplot() +
  geom_path(data = hour_coefs_nat_long_3p %>%
              filter(!coef %in% c("shape", "scale", "kappa")),
            aes(x = hour, y = value, colour = coef)) +
  geom_hline(yintercept = 0, linetype = "dashed") +
  scale_y_continuous(expression("Time-varying parameter values"~beta)) +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_color_discrete("Estimate") +
  theme_classic() +
  theme(legend.position = "bottom")

Plot only the temporally dynamic movement parameters

Code
ggplot() +
    geom_path(data = hour_coefs_nat_long_0p %>% 
              filter(coef %in% c("shape", "kappa")),
              aes(x = hour, y = value, colour = coef)) +
  geom_path(data = hour_coefs_nat_long_0p %>%
              filter(coef == "scale"),
              aes(x = hour, y = value/1000, colour = coef)) +
    geom_hline(yintercept = 0, linetype = "dashed") +
    scale_y_continuous("Value of parameter") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  ggtitle("Note that the scale parameter is divided by 1000 for plotting") +
  scale_color_discrete("Estimate",
      labels = c("kappa" = "von Mises kappa",
                 "scale" = "Gamma scale / 1000",
                 "shape" = "Gamma shape")) +
    theme_classic() +
    theme(legend.position = "right")

Code
ggplot() +
    geom_path(data = hour_coefs_nat_long_1p %>% 
              filter(coef %in% c("shape", "kappa")),
              aes(x = hour, y = value, colour = coef)) +
  geom_path(data = hour_coefs_nat_long_1p %>%
              filter(coef == "scale"),
              aes(x = hour, y = value/1000, colour = coef)) +
    geom_hline(yintercept = 0, linetype = "dashed") +
    scale_y_continuous("Value of parameter") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  ggtitle("Note that the scale parameter is divided by 1000 for plotting") +
  scale_color_discrete("Estimate",
      labels = c("kappa" = "von Mises kappa",
                 "scale" = "Gamma scale / 1000",
                 "shape" = "Gamma shape")) +
    theme_classic() +
    theme(legend.position = "right")

Code
ggplot() +
    geom_path(data = hour_coefs_nat_long_2p %>% 
              filter(coef %in% c("shape", "kappa")),
              aes(x = hour, y = value, colour = coef)) +
  geom_path(data = hour_coefs_nat_long_2p %>%
              filter(coef == "scale"),
              aes(x = hour, y = value/1000, colour = coef)) +
    geom_hline(yintercept = 0, linetype = "dashed") +
    scale_y_continuous("Value of parameter") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  ggtitle("Note that the scale parameter is divided by 1000 for plotting") +
  scale_color_discrete("Estimate",
      labels = c("kappa" = "von Mises kappa",
                 "scale" = "Gamma scale / 1000",
                 "shape" = "Gamma shape")) +
    theme_classic() +
    theme(legend.position = "right")

Code
# ggsave(paste0("outputs/plots/manuscript_figs_R2/temporal_mvmt_params_",
#          Sys.Date(), ".png"),
#   width=150, height=90, units="mm", dpi = 1000)
Code
ggplot() +
    geom_path(data = hour_coefs_nat_long_3p %>% 
              filter(coef %in% c("shape", "kappa")),
              aes(x = hour, y = value, colour = coef)) +
  geom_path(data = hour_coefs_nat_long_3p %>%
              filter(coef == "scale"),
              aes(x = hour, y = value/1000, colour = coef)) +
    geom_hline(yintercept = 0, linetype = "dashed") +
    scale_y_continuous("Value of parameter") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  ggtitle("Note that the scale parameter is divided by 1000 for plotting") +
  scale_color_discrete("Estimate",
      labels = c("kappa" = "von Mises kappa",
                 "scale" = "Gamma scale / 1000",
                 "shape" = "Gamma shape")) +
    theme_classic() +
    theme(legend.position = "right")

Sample from temporally dynamic movement parameters

Here we sample from the movement kernel to generate a distribution of step lengths for each hour of the day, to assess how well it matches the observed step lengths. This is the ‘selection-free’ movement kernel: in the simulations the realised step lengths and turning angles will differ, as the steps are also conditioned on the habitat. It is nonetheless a useful diagnostic of whether the harmonics are capturing the observed movement dynamics.
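Before sampling, the mean of each hourly Gamma kernel is also available analytically as \(\text{shape} \times \text{scale}\), which gives a quick cross-check on the sampled means computed below (toy parameter values here, not the fitted ones):

```r
# the sample mean of a Gamma distribution converges to shape * scale
set.seed(123)
shape <- 0.45   # made-up values of a similar magnitude to the tentative ones
scale <- 500
samples <- rgamma(1e5, shape = shape, scale = scale)

analytic_mean <- shape * scale  # 225
# relative difference should be small (sampling noise only)
abs(mean(samples) - analytic_mean) / analytic_mean
```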

Code
# summarise the observed step lengths by hour
movement_summary_buffalo <- buffalo_data %>% 
  filter(y == 1) %>% 
  group_by(id, hour) %>% 
  summarise(mean_sl = mean(sl), median_sl = median(sl))
`summarise()` has grouped output by 'id'. You can override using the `.groups`
argument.
Code
# number of samples at each hour (more = smoother plotting, but slower)
n <- 1e5

gamma_dist_list <- vector(mode = "list", length = nrow(hour_coefs_nat_df_0p))
gamma_mean <- c()
gamma_median <- c()
gamma_ratio <- c()

for(hour_no in 1:nrow(hour_coefs_nat_df_0p)) {
  
  gamma_dist_list[[hour_no]] <- rgamma(n, shape = hour_coefs_nat_df_0p$shape[hour_no], 
                                       scale = hour_coefs_nat_df_0p$scale[hour_no])
  
  gamma_mean[hour_no] <- mean(gamma_dist_list[[hour_no]])
  gamma_median[hour_no] <- median(gamma_dist_list[[hour_no]])
  gamma_ratio[hour_no] <- gamma_mean[hour_no] / gamma_median[hour_no]

}

gamma_df_0p <- data.frame(model = "0p", 
                          hour = hour_coefs_nat_df_0p$hour, 
                          mean = gamma_mean, 
                          median = gamma_median, 
                          ratio = gamma_ratio) 

mean_sl_0p <- ggplot() +
  geom_path(data = movement_summary_buffalo, 
            aes(x = hour, y = mean_sl, colour = factor(id))) +
  geom_path(data = gamma_df_0p, 
            aes(x = hour, y = mean), colour = "red", linetype = "dashed") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_y_continuous("Mean step length") +
  scale_colour_viridis_d("Buffalo") +
  ggtitle("Observed and modelled mean step length",
          subtitle = "No harmonics") +
  theme_classic() +
  theme(legend.position = "right")

mean_sl_0p

Code
median_sl_0p <- ggplot() +
  geom_path(data = movement_summary_buffalo, 
            aes(x = hour, y = median_sl, colour = factor(id))) +
  geom_path(data = gamma_df_0p, aes(x = hour, y = median), 
            colour = "red", linetype = "dashed") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_y_continuous("Median step length") +
  scale_colour_viridis_d("Buffalo") +
  ggtitle("Observed and modelled median step length",
          subtitle = "No harmonics") +
  theme_classic() +
  theme(legend.position = "right")

median_sl_0p

Code
# compare the mean and median step lengths by individual buffalo
buffalo_data_all %>% filter(y == 1) %>% group_by(id) %>% 
  summarise(mean_sl = mean(sl), 
            median_sl = median(sl), 
            ratio = mean_sl/median_sl)
Code
# all buffalo
buffalo_data_all %>% filter(y == 1) %>% 
  summarise(mean_sl = mean(sl), 
            median_sl = median(sl), 
            ratio = mean_sl/median_sl)
Code
# fitted model
gamma_df_0p %>% summarise(mean_mean = mean(mean),
                         median_mean = mean(median), 
                         ratio_mean = mean_mean/median_mean)
Code
gamma_dist_list <- vector(mode = "list", length = nrow(hour_coefs_nat_df_1p))
gamma_mean <- c()
gamma_median <- c()
gamma_ratio <- c()

for(hour_no in 1:nrow(hour_coefs_nat_df_1p)) {
  
  gamma_dist_list[[hour_no]] <- rgamma(n, 
                                       shape = hour_coefs_nat_df_1p$shape[hour_no], 
                                       scale = hour_coefs_nat_df_1p$scale[hour_no])
  
  gamma_mean[hour_no] <- mean(gamma_dist_list[[hour_no]])
  gamma_median[hour_no] <- median(gamma_dist_list[[hour_no]])
  gamma_ratio[hour_no] <- gamma_mean[hour_no] / gamma_median[hour_no]

}

gamma_df_1p <- data.frame(model = "1p", 
                          hour = hour_coefs_nat_df_1p$hour, 
                          mean = gamma_mean, 
                          median = gamma_median, 
                          ratio = gamma_ratio) 

mean_sl_1p <- ggplot() +
  geom_path(data = movement_summary_buffalo, 
            aes(x = hour, y = mean_sl, colour = factor(id))) +
  geom_path(data = gamma_df_1p, 
            aes(x = hour, y = mean), 
            colour = "red", linetype = "dashed") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_y_continuous("Mean step length") +
  scale_colour_viridis_d("Buffalo") +
  ggtitle("Observed and modelled mean step length",
          subtitle = "One pair of harmonics") +
  theme_classic() +
  theme(legend.position = "none")

mean_sl_1p

Code
median_sl_1p <- ggplot() +
  geom_path(data = movement_summary_buffalo, 
            aes(x = hour, y = median_sl, colour = factor(id))) +
  geom_path(data = gamma_df_1p, 
            aes(x = hour, y = median), 
            colour = "red", linetype = "dashed") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_y_continuous("Median step length") +
  scale_colour_viridis_d("Buffalo") +
  ggtitle("Observed and modelled median step length",
          subtitle = "One pair of harmonics") +
  theme_classic() +
  theme(legend.position = "none")

median_sl_1p

Code
# by individual buffalo
buffalo_data_all %>% filter(y == 1) %>% group_by(id) %>% 
  summarise(mean_sl = mean(sl), 
            median_sl = median(sl), 
            ratio = mean_sl/median_sl)
Code
buffalo_data_all %>% filter(y == 1) %>% 
  summarise(mean_sl = mean(sl), 
            median_sl = median(sl), 
            ratio = mean_sl/median_sl)
Code
gamma_df_1p %>% summarise(mean_mean = mean(mean),
                      median_mean = mean(median), 
                      ratio_mean = mean_mean/median_mean)
Code
gamma_dist_list <- vector(mode = "list", length = nrow(hour_coefs_nat_df_2p))
gamma_mean <- c()
gamma_median <- c()
gamma_ratio <- c()

for(hour_no in 1:nrow(hour_coefs_nat_df_2p)) {
  
  gamma_dist_list[[hour_no]] <- rgamma(n, 
                                       shape = hour_coefs_nat_df_2p$shape[hour_no], 
                                       scale = hour_coefs_nat_df_2p$scale[hour_no])
  
  gamma_mean[hour_no] <- mean(gamma_dist_list[[hour_no]])
  gamma_median[hour_no] <- median(gamma_dist_list[[hour_no]])
  gamma_ratio[hour_no] <- gamma_mean[hour_no] / gamma_median[hour_no]

}

gamma_df_2p <- data.frame(model = "2p", 
                          hour = hour_coefs_nat_df_2p$hour, 
                          mean = gamma_mean, 
                          median = gamma_median, 
                          ratio = gamma_ratio) 

mean_sl_2p <- ggplot() +
  geom_path(data = movement_summary_buffalo, 
            aes(x = hour, y = mean_sl, colour = factor(id))) +
  geom_path(data = gamma_df_2p, 
            aes(x = hour, y = mean), 
            colour = "red", linetype = "dashed") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_y_continuous("Mean step length") +
  scale_colour_viridis_d("Buffalo") +
  ggtitle("Observed and modelled mean step length",
          subtitle = "Two pairs of harmonics") +
  theme_classic() +
  theme(legend.position = "none")

mean_sl_2p

Code
median_sl_2p <- ggplot() +
  geom_path(data = movement_summary_buffalo, 
            aes(x = hour, y = median_sl, colour = factor(id))) +
  geom_path(data = gamma_df_2p, 
            aes(x = hour, y = median), 
            colour = "red", linetype = "dashed") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_y_continuous("Median step length") +
  scale_colour_viridis_d("Buffalo") +
  ggtitle("Observed and modelled median step length",
          subtitle = "Two pairs of harmonics") +
  theme_classic() +
  theme(legend.position = "none")

median_sl_2p

Code
# by individual buffalo
buffalo_data_all %>% filter(y == 1) %>% group_by(id) %>% 
  summarise(mean_sl = mean(sl), 
            median_sl = median(sl), 
            ratio = mean_sl/median_sl)
Code
buffalo_data_all %>% filter(y == 1) %>% 
  summarise(mean_sl = mean(sl), 
            median_sl = median(sl), 
            ratio = mean_sl/median_sl)
Code
gamma_df_2p %>% summarise(mean_mean = mean(mean),
                         median_mean = mean(median), 
                         ratio_mean = mean_mean/median_mean)
Code
gamma_dist_list <- vector(mode = "list", length = nrow(hour_coefs_nat_df_3p))
gamma_mean <- c()
gamma_median <- c()
gamma_ratio <- c()

for(hour_no in 1:nrow(hour_coefs_nat_df_3p)) {
  
  gamma_dist_list[[hour_no]] <- rgamma(n, 
                                       shape = hour_coefs_nat_df_3p$shape[hour_no], 
                                       scale = hour_coefs_nat_df_3p$scale[hour_no])
  
  gamma_mean[hour_no] <- mean(gamma_dist_list[[hour_no]])
  gamma_median[hour_no] <- median(gamma_dist_list[[hour_no]])
  gamma_ratio[hour_no] <- gamma_mean[hour_no] / gamma_median[hour_no]

}

gamma_df_3p <- data.frame(model = "3p", 
                          hour = hour_coefs_nat_df_3p$hour, 
                          mean = gamma_mean, 
                          median = gamma_median, 
                          ratio = gamma_ratio)

mean_sl_3p <- ggplot() +
  geom_path(data = movement_summary_buffalo, 
            aes(x = hour, y = mean_sl, colour = factor(id))) +
  geom_path(data = gamma_df_3p, 
            aes(x = hour, y = mean), 
            colour = "red", linetype = "dashed") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_y_continuous("Mean step length") +
  scale_colour_viridis_d("Buffalo") +
  ggtitle("Observed and modelled mean step length",
          subtitle = "Three pairs of harmonics") +
  theme_classic() +
  theme(legend.position = "none")

mean_sl_3p

Code
median_sl_3p <- ggplot() +
  geom_path(data = movement_summary_buffalo, 
            aes(x = hour, y = median_sl, colour = factor(id))) +
  geom_path(data = gamma_df_3p, 
            aes(x = hour, y = median), 
            colour = "red", linetype = "dashed") +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_y_continuous("Median step length") +
  scale_colour_viridis_d("Buffalo") +
  ggtitle("Observed and modelled median step length",
          subtitle = "Three pairs of harmonics") +
  theme_classic() +
  theme(legend.position = "none")

median_sl_3p

Code
# by individual buffalo
buffalo_data_all %>% filter(y == 1) %>% group_by(id) %>% 
  summarise(mean_sl = mean(sl), 
            median_sl = median(sl), 
            ratio = mean_sl/median_sl)
Code
buffalo_data_all %>% filter(y == 1) %>% 
  summarise(mean_sl = mean(sl), 
            median_sl = median(sl), 
            ratio = mean_sl/median_sl)
Code
gamma_df_3p %>% summarise(mean_mean = mean(mean),
                      median_mean = mean(median), 
                      ratio_mean = mean_mean/median_mean)
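The four sampling blocks above repeat the same logic for each model; the pattern could be wrapped in a helper function (a sketch, assuming each `hour_coefs_nat_df_*` data frame has `hour`, `shape` and `scale` columns, as used above):

```r
# summarise the hourly gamma movement kernel for one model's coefficients
summarise_gamma_by_hour <- function(coefs, model_label, n = 1e5) {
  means <- numeric(nrow(coefs))
  medians <- numeric(nrow(coefs))
  for (i in seq_len(nrow(coefs))) {
    draws <- rgamma(n, shape = coefs$shape[i], scale = coefs$scale[i])
    means[i] <- mean(draws)
    medians[i] <- median(draws)
  }
  data.frame(model = model_label, hour = coefs$hour,
             mean = means, median = medians, ratio = means / medians)
}

# e.g. gamma_df_0p <- summarise_gamma_by_hour(hour_coefs_nat_df_0p, "0p")
```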

Creating selection surfaces

As we have both quadratic and harmonic terms in the model, we can reconstruct a ‘selection surface’ to visualise how the animals’ responses to environmental features change through time.

To illustrate: if we don’t have temporal dynamics (as in the model with no harmonics), then we have one coefficient for the linear term and one for the quadratic term. Using these, we can plot the selection curve across the range of the environmental variable (in this case NDVI).
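A useful check on these curves: for a quadratic β₁x + β₂x² with β₂ < 0, the selection peak sits at x = −β₁ / (2β₂). A sketch with hypothetical coefficients:

```r
# hypothetical linear and quadratic coefficients (not the fitted values)
beta_linear <- 2.0
beta_quadratic <- -2.5

# NDVI value at which selection is strongest (vertex of the parabola)
-beta_linear / (2 * beta_quadratic)  # 0.4
```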

Using the natural scale coefficients from the model:

Code
# first get a sequence of NDVI values, 
# starting from the minimum observed in the data to the maximum
ndvi_min <- min(buffalo_data$ndvi_temporal, na.rm = TRUE)
ndvi_max <- max(buffalo_data$ndvi_temporal, na.rm = TRUE)
ndvi_seq <- seq(ndvi_min, ndvi_max, by = 0.01)

# take the coefficients from the model and calculate the selection value 
# for every NDVI value in this sequence

# we can separate out the linear term
ndvi_linear_selection <- hour_coefs_nat_df_0p$ndvi[1] * ndvi_seq 
plot(x = ndvi_seq, y = ndvi_linear_selection,
     main = "Selection for NDVI - linear term",
     xlab = "NDVI", ylab = "Estimated selection")
lines(ndvi_seq, rep(0,length(ndvi_seq)), lty = "dashed")

Code
# and the quadratic term
ndvi_quadratic_selection <- (hour_coefs_nat_df_0p$ndvi_2[1] * (ndvi_seq ^ 2))
plot(x = ndvi_seq, y = ndvi_quadratic_selection,
     main = "Selection for NDVI - quadratic term",
     xlab = "NDVI", ylab = "Estimated selection")
lines(ndvi_seq, rep(0,length(ndvi_seq)), lty = "dashed")

Code
# and the sum of both
ndvi_sum_selection <- ndvi_linear_selection + ndvi_quadratic_selection
plot(x = ndvi_seq, y = ndvi_sum_selection,
     main = "Selection for NDVI - sum of linear and quadratic terms",
     xlab = "NDVI", ylab = "Estimated selection")
lines(ndvi_seq, rep(0,length(ndvi_seq)), lty = "dashed")

When there are no temporal dynamics, this quadratic curve is the same throughout the day, but when the coefficients for both the linear and the quadratic terms are temporally dynamic, the curve varies continuously through the day, which we can visualise as a selection surface.
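As a reminder of where the temporally dynamic coefficients come from, each coefficient is reconstructed from the fitted linear term and its harmonic interactions, e.g. for one pair of harmonics β(t) = β₀ + a₁ sin(2πt/24) + b₁ cos(2πt/24). A sketch with illustrative (hypothetical) values:

```r
hours <- 0:23

# illustrative fitted values for the linear term
# and one pair of harmonic interaction terms (hypothetical)
beta_0 <- 0.5
a_1 <- 0.3
b_1 <- -0.2

# reconstruct the hourly coefficient from the harmonic terms
beta_t <- beta_0 + a_1 * sin(2 * pi * hours / 24) + b_1 * cos(2 * pi * hours / 24)
plot(hours, beta_t, type = "l", xlab = "Hour", ylab = expression(beta))
```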

Here we illustrate with the model with one pair of harmonic terms.

For brevity we won’t plot the linear and quadratic terms separately, but we can do so if needed.

First, for hour 3:

Code
hour_no <- 3

# we can separate out the linear term
ndvi_linear_selection <- 
  hour_coefs_nat_df_1p$ndvi[which(hour_coefs_nat_df_1p$hour == hour_no)] * ndvi_seq
# plot(x = ndvi_seq, y = ndvi_linear_selection,
#      main = "Selection for NDVI - linear term",
#      xlab = "NDVI", ylab = "Estimated selection")

# and the quadratic term
ndvi_quadratic_selection <- 
  (hour_coefs_nat_df_1p$ndvi_2[which(hour_coefs_nat_df_1p$hour == hour_no)] * (ndvi_seq ^ 2))
# plot(x = ndvi_seq, y = ndvi_quadratic_selection,
#      main = "Selection for NDVI - quadratic term",
#      xlab = "NDVI", ylab = "Estimated selection")

# and the sum of both
ndvi_sum_selection <- ndvi_linear_selection + ndvi_quadratic_selection
plot(x = ndvi_seq, y = ndvi_sum_selection,
     main = "Selection for NDVI - sum of linear and quadratic terms",
     xlab = "NDVI", ylab = "Estimated selection")
lines(ndvi_seq, rep(0,length(ndvi_seq)), lty = "dashed")

We can see that the selection curve at hour 3 peaks for NDVI values slightly above 0.2, and is mostly negative.

Second, for hour 12:

Code
hour_no <- 12

# we can separate out the linear term
ndvi_linear_selection <- 
  hour_coefs_nat_df_1p$ndvi[which(hour_coefs_nat_df_1p$hour == hour_no)] * ndvi_seq
# plot(x = ndvi_seq, y = ndvi_linear_selection,
#      main = "Selection for NDVI - linear term",
#      xlab = "NDVI", ylab = "Estimated selection")

# and the quadratic term
ndvi_quadratic_selection <- 
  (hour_coefs_nat_df_1p$ndvi_2[which(hour_coefs_nat_df_1p$hour == hour_no)] * (ndvi_seq ^ 2))
# plot(x = ndvi_seq, y = ndvi_quadratic_selection,
#      main = "Selection for NDVI - quadratic term",
#      xlab = "NDVI", ylab = "Estimated selection")

# and the sum of both
ndvi_sum_selection <- ndvi_linear_selection + ndvi_quadratic_selection
plot(x = ndvi_seq, y = ndvi_sum_selection,
     main = "Selection for NDVI - sum of linear and quadratic terms",
     xlab = "NDVI", ylab = "Estimated selection")
lines(ndvi_seq, rep(0,length(ndvi_seq)), lty = "dashed")

Whereas at hour 12, the selection curve peaks for NDVI values slightly above 0.4, and is positive for all NDVI values above 0.

We could view a plot like this for every hour of the day, each with a different quadratic curve, but that would be a lot of plots. Instead, we can view them together as a surface, where the x-axis is the hour of the day, the y-axis is the NDVI value, and the colour (z-axis) is the selection value.

To build it, we simply loop over the hourly linear and quadratic coefficients and calculate the selection value for every NDVI value at every hour.
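Equivalently, the whole surface can be computed in one step with `outer()` rather than a loop; a sketch with hypothetical hourly coefficients:

```r
set.seed(1)
# hypothetical hourly coefficients for the linear and quadratic terms
beta_ndvi <- runif(24, -1, 1)
beta_ndvi_2 <- runif(24, -1, 0)
ndvi_vals <- seq(-0.3, 0.8, by = 0.01)

# rows are NDVI values, columns are hours
surface <- outer(ndvi_vals, beta_ndvi) + outer(ndvi_vals^2, beta_ndvi_2)
dim(surface)  # 111 24
```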

NDVI selection surface

Code
ndvi_min <- min(buffalo_data$ndvi_temporal, na.rm = TRUE)
ndvi_max <- max(buffalo_data$ndvi_temporal, na.rm = TRUE)
ndvi_seq <- seq(ndvi_min, ndvi_max, by = 0.01)

# Create empty data frame
ndvi_fresponse_df <- data.frame(matrix(ncol = nrow(hour_coefs_nat_df_0p), 
                                       nrow = length(ndvi_seq)))

# loop over each time increment, calculating the selection values for each NDVI value
# and storing each time increment as a column in a dataframe that we can use for plotting
for(i in 1:nrow(hour_coefs_nat_df_0p)) {
  # Assign the vector as a column to the dataframe
  ndvi_fresponse_df[,i] <- (hour_coefs_nat_df_0p$ndvi[i] * ndvi_seq) + 
    (hour_coefs_nat_df_0p$ndvi_2[i] * (ndvi_seq ^ 2))
}

ndvi_fresponse_df <- data.frame(ndvi_seq, ndvi_fresponse_df)
colnames(ndvi_fresponse_df) <- c("ndvi", hour)
ndvi_fresponse_long <- pivot_longer(ndvi_fresponse_df, 
                                    cols = !1, names_to = "hour")

ndvi_contour_max <- max(ndvi_fresponse_long$value) # 0.7890195
ndvi_contour_min <- min(ndvi_fresponse_long$value) # -0.7945691
ndvi_contour_increment <- (ndvi_contour_max-ndvi_contour_min)/10

ndvi_quad_0p <- ggplot(data = ndvi_fresponse_long, 
                       aes(x = as.numeric(hour), y = ndvi)) +
  geom_point(aes(colour = value)) + 
  geom_contour(aes(z = value), 
               breaks = seq(ndvi_contour_increment, 
                            ndvi_contour_max, 
                            ndvi_contour_increment), 
               colour = "black", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), 
               breaks = seq(-ndvi_contour_increment, 
                            ndvi_contour_min, 
                            -ndvi_contour_increment), 
               colour = "red", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), breaks = 0, colour = "black", linewidth = 0.5) +
  scale_x_continuous("Hour", breaks = seq(0,24,6)) +
  scale_y_continuous("NDVI value", breaks = seq(-1, 1, 0.25)) +
  scale_colour_viridis_c("Selection") +
  ggtitle("Normalised Difference Vegetation Index (NDVI)") +
  theme_classic() +
  theme(legend.position = "none")

ndvi_quad_0p

Code
ndvi_min <- min(buffalo_data$ndvi_temporal, na.rm = TRUE)
ndvi_max <- max(buffalo_data$ndvi_temporal, na.rm = TRUE)
ndvi_seq <- seq(ndvi_min, ndvi_max, by = 0.01)

# Create empty data frame
ndvi_fresponse_df <- data.frame(matrix(ncol = nrow(hour_coefs_nat_df_1p), 
                                       nrow = length(ndvi_seq)))

for(i in 1:nrow(hour_coefs_nat_df_1p)) {
  # Assign the vector as a column to the dataframe
  ndvi_fresponse_df[,i] <- (hour_coefs_nat_df_1p$ndvi[i] * ndvi_seq) + 
    (hour_coefs_nat_df_1p$ndvi_2[i] * (ndvi_seq ^ 2))
}

ndvi_fresponse_df <- data.frame(ndvi_seq, ndvi_fresponse_df)
colnames(ndvi_fresponse_df) <- c("ndvi", hour)
ndvi_fresponse_long <- pivot_longer(ndvi_fresponse_df, cols = !1, names_to = "hour")

ndvi_contour_max <- max(ndvi_fresponse_long$value)
ndvi_contour_min <- min(ndvi_fresponse_long$value)
ndvi_contour_increment <- (ndvi_contour_max-ndvi_contour_min)/10

ndvi_quad_1p <- ggplot(data = ndvi_fresponse_long, 
                       aes(x = as.numeric(hour), y = ndvi)) +
  geom_point(aes(colour = value)) + # colour = "white"
  geom_contour(aes(z = value), 
               breaks = seq(ndvi_contour_increment, 
                            ndvi_contour_max, 
                            ndvi_contour_increment), 
               colour = "black", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), 
               breaks = seq(-ndvi_contour_increment, 
                            ndvi_contour_min, 
                            -ndvi_contour_increment), 
               colour = "red", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), breaks = 0, colour = "black", linewidth = 0.5) +
  scale_x_continuous("Hour", breaks = seq(0,24,6)) +
  scale_y_continuous("NDVI value", breaks = seq(-1, 1, 0.25)) +
  scale_colour_viridis_c("Selection") +
  # ggtitle("Normalised Difference Vegetation Index (NDVI)") +
  theme_classic() +
  theme(legend.position = "none")

ndvi_quad_1p

Code
ndvi_min <- min(buffalo_data$ndvi_temporal, na.rm = TRUE)
ndvi_max <- max(buffalo_data$ndvi_temporal, na.rm = TRUE)
ndvi_seq <- seq(ndvi_min, ndvi_max, by = 0.01)

# Create empty data frame
ndvi_fresponse_df <- data.frame(matrix(ncol = nrow(hour_coefs_nat_df_2p), 
                                       nrow = length(ndvi_seq)))

for(i in 1:nrow(hour_coefs_nat_df_2p)) {
  # Assign the vector as a column to the dataframe
  ndvi_fresponse_df[,i] <- (hour_coefs_nat_df_2p$ndvi[i] * ndvi_seq) + 
    (hour_coefs_nat_df_2p$ndvi_2[i] * (ndvi_seq ^ 2))
}

ndvi_fresponse_df <- data.frame(ndvi_seq, ndvi_fresponse_df)
colnames(ndvi_fresponse_df) <- c("ndvi", hour)
ndvi_fresponse_long <- pivot_longer(ndvi_fresponse_df, cols = !1, 
                                    names_to = "hour")

ndvi_contour_max <- max(ndvi_fresponse_long$value)
ndvi_contour_min <- min(ndvi_fresponse_long$value)
ndvi_contour_increment <- (ndvi_contour_max-ndvi_contour_min)/10

ndvi_quad_2p <- ggplot(data = ndvi_fresponse_long, 
                       aes(x = as.numeric(hour), y = ndvi)) +
  geom_point(aes(colour = value)) + # colour = "white"
  geom_contour(aes(z = value), 
               breaks = seq(ndvi_contour_increment, 
                            ndvi_contour_max, 
                            ndvi_contour_increment), 
               colour = "black", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), 
               breaks = seq(-ndvi_contour_increment, 
                            ndvi_contour_min, 
                            -ndvi_contour_increment), 
               colour = "red", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), breaks = 0, colour = "black", linewidth = 0.5) +
  scale_x_continuous("Hour", breaks = seq(0,24,6)) +
  scale_y_continuous("NDVI value", breaks = seq(-1, 1, 0.25)) +
  scale_colour_viridis_c("Selection") +
  # ggtitle("Normalised Difference Vegetation Index (NDVI)") +
  theme_classic() +
  theme(legend.position = "right")

ndvi_quad_2p

Code
# ggsave(paste0("outputs/plots/manuscript_figs_R2/ndvi_selection_surface_legend_",
#          Sys.Date(), ".png"),
#   width=170, height=90, units="mm", dpi = 1000)
Code
ndvi_min <- min(buffalo_data$ndvi_temporal, na.rm = TRUE)
ndvi_max <- max(buffalo_data$ndvi_temporal, na.rm = TRUE)
ndvi_seq <- seq(ndvi_min, ndvi_max, by = 0.01)

# Create empty data frame
ndvi_fresponse_df <- data.frame(matrix(ncol = nrow(hour_coefs_nat_df_3p), 
                                       nrow = length(ndvi_seq)))

for(i in 1:nrow(hour_coefs_nat_df_3p)) {
  # Assign the vector as a column to the dataframe
  ndvi_fresponse_df[,i] <- (hour_coefs_nat_df_3p$ndvi[i] * ndvi_seq) + 
    (hour_coefs_nat_df_3p$ndvi_2[i] * (ndvi_seq ^ 2))
}

ndvi_fresponse_df <- data.frame(ndvi_seq, ndvi_fresponse_df)
colnames(ndvi_fresponse_df) <- c("ndvi", hour)
ndvi_fresponse_long <- pivot_longer(ndvi_fresponse_df, cols = !1, 
                                    names_to = "hour")

ndvi_contour_max <- max(ndvi_fresponse_long$value)
ndvi_contour_min <- min(ndvi_fresponse_long$value)
ndvi_contour_increment <- (ndvi_contour_max-ndvi_contour_min)/10

ndvi_quad_3p <- ggplot(data = ndvi_fresponse_long, 
                       aes(x = as.numeric(hour), y = ndvi)) +
  geom_point(aes(colour = value)) + # colour = "white"
  geom_contour(aes(z = value), 
               breaks = seq(ndvi_contour_increment, 
                            ndvi_contour_max, 
                            ndvi_contour_increment), 
               colour = "black", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), 
               breaks = seq(-ndvi_contour_increment, 
                            ndvi_contour_min, 
                            -ndvi_contour_increment), 
               colour = "red", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), breaks = 0, colour = "black", linewidth = 0.5) +
  scale_x_continuous("Hour", breaks = seq(0,24,6)) +
  scale_y_continuous("NDVI value", breaks = seq(-1, 1, 0.25)) +
  scale_colour_viridis_c("Selection") +
  # ggtitle("Normalised Difference Vegetation Index (NDVI)") +
  theme_classic() +
  theme(legend.position = "none")

ndvi_quad_3p

Canopy cover selection surface

Code
canopy_min <- min(buffalo_data$canopy_01, na.rm = TRUE)
canopy_max <- max(buffalo_data$canopy_01, na.rm = TRUE)
canopy_seq <- seq(canopy_min, canopy_max, by = 0.01)

# Create empty data frame
canopy_fresponse_df <- data.frame(matrix(ncol = nrow(hour_coefs_nat_df_0p), 
                                         nrow = length(canopy_seq)))

for(i in 1:nrow(hour_coefs_nat_df_0p)) {
  # Assign the vector as a column to the dataframe
  canopy_fresponse_df[,i] <- (hour_coefs_nat_df_0p$canopy[i] * canopy_seq) + 
    (hour_coefs_nat_df_0p$canopy_2[i] * (canopy_seq ^ 2))
}

canopy_fresponse_df <- data.frame(canopy_seq, canopy_fresponse_df)
colnames(canopy_fresponse_df) <- c("canopy", hour)
canopy_fresponse_long <- pivot_longer(canopy_fresponse_df, 
                                      cols = !1, 
                                      names_to = "hour")

canopy_contour_min <- min(canopy_fresponse_long$value) # 0
canopy_contour_max <- max(canopy_fresponse_long$value) # 2.181749
canopy_contour_increment <- (canopy_contour_max-canopy_contour_min)/10

canopy_quad_0p <- ggplot(data = canopy_fresponse_long, aes(x = as.numeric(hour), 
                                                           y = canopy)) +
  geom_point(aes(colour = value)) +
  geom_contour(aes(z = value), 
               breaks = seq(canopy_contour_increment, 
                            canopy_contour_max, 
                            canopy_contour_increment), 
               colour = "black", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value),
               breaks = seq(-canopy_contour_increment, 
                            canopy_contour_min, 
                            -canopy_contour_increment),
               colour = "red", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), breaks = 0, colour = "black", linewidth = 0.5) +
  scale_x_continuous("Hour", breaks = seq(0,24,6)) +
  scale_y_continuous("Canopy cover", breaks = seq(0, 1, 0.25)) +
  scale_colour_viridis_c("Selection") +
  ggtitle("Canopy Cover") +
  theme_classic() +
  theme(legend.position = "none")

canopy_quad_0p
Warning: `stat_contour()`: Zero contours were generated
Warning in min(x): no non-missing arguments to min; returning Inf
Warning in max(x): no non-missing arguments to max; returning -Inf

Code
canopy_min <- min(buffalo_data$canopy_01, na.rm = TRUE)
canopy_max <- max(buffalo_data$canopy_01, na.rm = TRUE)
canopy_seq <- seq(canopy_min, canopy_max, by = 0.01)

# Create empty data frame
canopy_fresponse_df <- data.frame(matrix(ncol = nrow(hour_coefs_nat_df_1p), 
                                         nrow = length(canopy_seq)))

for(i in 1:nrow(hour_coefs_nat_df_1p)) {
  # Assign the vector as a column to the dataframe
  canopy_fresponse_df[,i] <- (hour_coefs_nat_df_1p$canopy[i] * canopy_seq) + 
    (hour_coefs_nat_df_1p$canopy_2[i] * (canopy_seq ^ 2))
}

canopy_fresponse_df <- data.frame(canopy_seq, canopy_fresponse_df)
colnames(canopy_fresponse_df) <- c("canopy", hour)
canopy_fresponse_long <- pivot_longer(canopy_fresponse_df, cols = !1, 
                                      names_to = "hour")

canopy_contour_min <- min(canopy_fresponse_long$value)
canopy_contour_max <- max(canopy_fresponse_long$value)
canopy_contour_increment <- (canopy_contour_max-canopy_contour_min)/10

canopy_quad_1p <- ggplot(data = canopy_fresponse_long, 
                         aes(x = as.numeric(hour), y = canopy)) +
  geom_point(aes(colour = value)) +
  geom_contour(aes(z = value), 
               breaks = seq(canopy_contour_increment, 
                            canopy_contour_max, 
                            canopy_contour_increment), 
               colour = "black", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value),
               breaks = seq(-canopy_contour_increment, 
                            canopy_contour_min, 
                            -canopy_contour_increment),
               colour = "red", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), breaks = 0, colour = "black", linewidth = 0.5) +
  scale_x_continuous("Hour", breaks = seq(0,24,6)) +
  scale_y_continuous("Canopy cover", breaks = seq(0, 1, 0.25)) +
  scale_colour_viridis_c("Selection") +
  # ggtitle("Canopy Cover") +
  theme_classic() +
  theme(legend.position = "none")

canopy_quad_1p

Code
canopy_min <- min(buffalo_data$canopy_01, na.rm = TRUE)
canopy_max <- max(buffalo_data$canopy_01, na.rm = TRUE)
canopy_seq <- seq(canopy_min, canopy_max, by = 0.01)

# Create empty data frame
canopy_fresponse_df <- data.frame(matrix(ncol = nrow(hour_coefs_nat_df_2p), 
                                         nrow = length(canopy_seq)))

for(i in 1:nrow(hour_coefs_nat_df_2p)) {
  # Assign the vector as a column to the dataframe
  canopy_fresponse_df[,i] <- (hour_coefs_nat_df_2p$canopy[i] * canopy_seq) + 
    (hour_coefs_nat_df_2p$canopy_2[i] * (canopy_seq ^ 2))
}

canopy_fresponse_df <- data.frame(canopy_seq, canopy_fresponse_df)
colnames(canopy_fresponse_df) <- c("canopy", hour)
canopy_fresponse_long <- pivot_longer(canopy_fresponse_df, cols = !1, 
                                      names_to = "hour")

canopy_contour_min <- min(canopy_fresponse_long$value)
canopy_contour_max <- max(canopy_fresponse_long$value)
canopy_contour_increment <- (canopy_contour_max-canopy_contour_min)/10

canopy_quad_2p <- ggplot(data = canopy_fresponse_long, 
                         aes(x = as.numeric(hour), y = canopy)) +
  geom_point(aes(colour = value)) +
  geom_contour(aes(z = value), 
               breaks = seq(canopy_contour_increment, 
                            canopy_contour_max, 
                            canopy_contour_increment), 
               colour = "black", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value),
               breaks = seq(-canopy_contour_increment, 
                            canopy_contour_min, 
                            -canopy_contour_increment),
               colour = "red", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), breaks = 0, colour = "black", linewidth = 0.5) +
  scale_x_continuous("Hour", breaks = seq(0,24,6)) +
  scale_y_continuous("Canopy cover", breaks = seq(0, 1, 0.25)) +
  scale_colour_viridis_c("Selection") +
  # ggtitle("Canopy Cover") +
  theme_classic() +
  theme(legend.position = "none")

canopy_quad_2p

Code
canopy_min <- min(buffalo_data$canopy_01, na.rm = TRUE)
canopy_max <- max(buffalo_data$canopy_01, na.rm = TRUE)
canopy_seq <- seq(canopy_min, canopy_max, by = 0.01)

# Create empty data frame
canopy_fresponse_df <- data.frame(matrix(ncol = nrow(hour_coefs_nat_df_3p), 
                                         nrow = length(canopy_seq)))

for(i in 1:nrow(hour_coefs_nat_df_3p)) {
  # Assign the vector as a column to the dataframe
  canopy_fresponse_df[,i] <- (hour_coefs_nat_df_3p$canopy[i] * canopy_seq) + 
    (hour_coefs_nat_df_3p$canopy_2[i] * (canopy_seq ^ 2))
}

canopy_fresponse_df <- data.frame(canopy_seq, canopy_fresponse_df)
colnames(canopy_fresponse_df) <- c("canopy", hour)
canopy_fresponse_long <- pivot_longer(canopy_fresponse_df, cols = !1, 
                                      names_to = "hour")

canopy_contour_min <- min(canopy_fresponse_long$value)
canopy_contour_max <- max(canopy_fresponse_long$value)
canopy_contour_increment <- (canopy_contour_max-canopy_contour_min)/10

canopy_quad_3p <- ggplot(data = canopy_fresponse_long, 
                         aes(x = as.numeric(hour), y = canopy)) +
  geom_point(aes(colour = value)) +
  geom_contour(aes(z = value), 
               breaks = seq(canopy_contour_increment, 
                            canopy_contour_max, 
                            canopy_contour_increment), 
               colour = "black", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value),
               breaks = seq(-canopy_contour_increment, 
                            canopy_contour_min, 
                            -canopy_contour_increment),
               colour = "red", linewidth = 0.25, linetype = "dashed") +
  geom_contour(aes(z = value), breaks = 0, colour = "black", linewidth = 0.5) +
  scale_x_continuous("Hour", breaks = seq(0,24,6)) +
  scale_y_continuous("Canopy cover", breaks = seq(0, 1, 0.25)) +
  scale_colour_viridis_c("Selection") +
  # ggtitle("Canopy Cover", 
  #         subtitle = "Three pairs of harmonics") +
  theme_classic() +
  theme(legend.position = "none")

canopy_quad_3p
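Since patchwork is loaded, the four canopy surface panels can be tiled for direct comparison (a sketch; the 2 × 2 layout is an illustrative choice):

```r
# 2 x 2 grid: no harmonics and one pair on top, two and three pairs below
(canopy_quad_0p + canopy_quad_1p) / (canopy_quad_2p + canopy_quad_3p)
```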

Combining the plots

Movement parameters

Code
gamma_df <- rbind(gamma_df_0p, gamma_df_1p, gamma_df_2p, gamma_df_3p)
gamma_df <- gamma_df %>% mutate(model_f = as.numeric(factor(model)))

mean_sl <- ggplot() +
  geom_path(data = movement_summary_buffalo, 
            aes(x = hour, y = mean_sl, group = factor(id)), 
            alpha = 0.25) +
  geom_path(data = gamma_df, aes(x = hour, y = mean, linetype = model)) +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_y_continuous("Mean step length") +
  scale_linetype_manual("Model", breaks=c("0p","1p", "2p", "3p"), 
                        values=c(4,3,2,1)) +
  ggtitle("Observed and modelled mean step length") +
  theme_classic() +
  theme(legend.position = "bottom")

mean_sl

Code
# ggsave(paste0("outputs/plots/manuscript_figs_R1/mean_sl_", 
#          Sys.Date(), ".png"),
#   width=150, height=90, units="mm", dpi = 1000)

median_sl <- ggplot() +
  geom_path(data = movement_summary_buffalo, 
            aes(x = hour, y = median_sl, group = factor(id)), 
            alpha = 0.25) +
  geom_path(data = gamma_df, aes(x = hour, y = median, linetype = model)) +
  scale_x_continuous("Hour", breaks = seq(0,24,2)) +
  scale_y_continuous("Median step length") +
  scale_linetype_manual("Model", breaks=c("0p","1p", "2p", "3p"), 
                        values=c(4,3,2,1)) +
  ggtitle("Observed and modelled median step length") +
  theme_classic() +
  theme(legend.position = "bottom")

median_sl

Code
# ggsave(paste0("outputs/plots/manuscript_figs_R1/median_sl_", 
#          Sys.Date(), ".png"),
#   width=150, height=90, units="mm", dpi = 1000)

Habitat selection

Code
# Label each model's temporally varying coefficient estimates, then stack
# the four models into a single long data frame for plotting
harmonics_scaled_long_0p <- harmonics_scaled_long_0p %>% mutate(model = "0p")
harmonics_scaled_long_1p <- harmonics_scaled_long_1p %>% mutate(model = "1p")
harmonics_scaled_long_2p <- harmonics_scaled_long_2p %>% mutate(model = "2p")
harmonics_scaled_long_3p <- harmonics_scaled_long_3p %>% mutate(model = "3p")

harmonics_scaled_long_Mp <- rbind(harmonics_scaled_long_0p, 
                                  harmonics_scaled_long_1p, 
                                  harmonics_scaled_long_2p, 
                                  harmonics_scaled_long_3p)

coef_titles <- unique(harmonics_scaled_long_Mp$coef)


ndvi_harms <- ggplot() +
      geom_path(data = harmonics_scaled_long_Mp %>%
                filter(coef == "ndvi"), 
                aes(x = hour, y = value, linetype = model)) +
      geom_hline(yintercept = 0, linetype = "dashed", colour = "red") +
      scale_y_continuous(expression(beta)) +
      scale_x_continuous("Hour") +
  scale_linetype_manual("Model", breaks=c("0p","1p", "2p", "3p"), 
                        values=c(4,3,2,1)) +
      ggtitle("NDVI") +
      theme_classic() +
      theme(legend.position = "bottom")

ndvi_harms

Code
ndvi_2_harms <- ggplot() +
      geom_path(data = harmonics_scaled_long_Mp %>%
                filter(coef == "ndvi_2"), 
                aes(x = hour, y = value, linetype = model)) +
      geom_hline(yintercept = 0, linetype = "dashed", colour = "red") +
      scale_y_continuous(expression(beta)) +
      scale_x_continuous("Hour") +
  scale_linetype_manual("Model", breaks=c("0p","1p", "2p", "3p"), 
                        values=c(4,3,2,1)) +
      ggtitle(expression(NDVI^2)) +
      theme_classic() +
      theme(legend.position = "bottom")

ndvi_2_harms

Code
canopy_harms <- ggplot() +
      geom_path(data = harmonics_scaled_long_Mp %>%
                filter(coef == "canopy"), 
                aes(x = hour, y = value, linetype = model)) +
      geom_hline(yintercept = 0, linetype = "dashed", colour = "red") +
      scale_y_continuous(expression(beta)) +
      scale_x_continuous("Hour") +
  scale_linetype_manual("Model", breaks=c("0p","1p", "2p", "3p"), 
                        values=c(4,3,2,1)) +
      ggtitle("Canopy cover") +
      theme_classic() +
      theme(legend.position = "bottom")

canopy_harms

Code
canopy_2_harms <- ggplot() +
      geom_path(data = harmonics_scaled_long_Mp %>%
                filter(coef == "canopy_2"), 
                aes(x = hour, y = value, linetype = model)) +
      geom_hline(yintercept = 0, linetype = "dashed", colour = "red") +
      scale_y_continuous(expression(beta)) +
      scale_x_continuous("Hour") +
  scale_linetype_manual("Model", breaks=c("0p","1p", "2p", "3p"), 
                        values=c(4,3,2,1)) +
      ggtitle(expression(Canopy~cover^2)) +
      theme_classic() +
      theme(legend.position = "bottom")

canopy_2_harms

Code
herby_harms <- ggplot() +
      geom_path(data = harmonics_scaled_long_Mp %>%
                filter(coef == "herby"), 
                aes(x = hour, y = value, linetype = model)) +
      geom_hline(yintercept = 0, linetype = "dashed", colour = "red") +
      scale_y_continuous(expression(beta), limits = c(-0.4,0.15)) +
      scale_x_continuous("Hour") +
  scale_linetype_manual("Model", breaks=c("0p","1p", "2p", "3p"), 
                        values=c(4,3,2,1)) +
      ggtitle("Herbaceous vegetation") +
      theme_classic() +
      theme(legend.position = "bottom")

herby_harms
Warning: Removed 22 rows containing missing values or values outside the scale range
(`geom_path()`).

Code
slope_harms <- ggplot() +
      geom_path(data = harmonics_scaled_long_Mp %>%
                filter(coef == "slope"), 
                aes(x = hour, y = value, linetype = model)) +
      geom_hline(yintercept = 0, linetype = "dashed", colour = "red") +
      scale_y_continuous(expression(beta), limits = c(-0.4,0.15)) +
      scale_x_continuous("Hour") +
  scale_linetype_manual("Model", breaks=c("0p","1p", "2p", "3p"),
                        values=c(4,3,2,1)) +
      ggtitle("Slope") +
      theme_classic() +
      theme(legend.position = "bottom")

slope_harms
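The six coefficient plots above differ only in the coefficient being filtered, the title, and (for two of them) the y-axis limits, so they could equally be generated by a small helper function. A minimal sketch (a hypothetical refactor, not part of the original analysis), assuming `harmonics_scaled_long_Mp` is in the workspace:

```r
library(ggplot2)
library(dplyr)

# Hypothetical helper: plot the temporally varying coefficient for one covariate
plot_harmonic_coef <- function(data, coef_name, title, y_limits = NULL) {
  ggplot(data %>% filter(coef == coef_name),
         aes(x = hour, y = value, linetype = model)) +
    geom_path() +
    geom_hline(yintercept = 0, linetype = "dashed", colour = "red") +
    scale_y_continuous(expression(beta), limits = y_limits) +
    scale_x_continuous("Hour") +
    scale_linetype_manual("Model", breaks = c("0p", "1p", "2p", "3p"),
                          values = c(4, 3, 2, 1)) +
    ggtitle(title) +
    theme_classic() +
    theme(legend.position = "bottom")
}

# e.g. slope_harms <- plot_harmonic_coef(harmonics_scaled_long_Mp, "slope",
#                                        "Slope", y_limits = c(-0.4, 0.15))
```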

Code
ggarrange(herby_harms, 
          slope_harms, 
          labels = c("A", "B"), 
          ncol = 2, nrow = 1,
          align = "hv",
          legend = "bottom", 
          common.legend = TRUE)
Warning: Removed 22 rows containing missing values or values outside the scale range
(`geom_path()`).
Removed 22 rows containing missing values or values outside the scale range
(`geom_path()`).

Code
# ggsave(paste0("outputs/plots/manuscript_figs_R1/herby_slope_harmonic_functions_", 
#          Sys.Date(), ".png"),
#   width=150, height=90, units="mm", dpi = 1000)

Combining selection surfaces

NDVI

  • A = 0p model
  • B = 1p model
  • C = 2p model
  • D = 3p model
Code
ggarrange(ndvi_quad_0p + theme(plot.title = element_blank(),
                               axis.title.x = element_blank(),
                               axis.text.x = element_blank()), 
          
          ndvi_quad_1p + theme(plot.title = element_blank(), 
                               axis.title.x = element_blank(),
                               axis.text.x = element_blank(),
                               axis.title.y = element_blank(),
                               ), 
          
          ndvi_quad_2p, 
          
          ndvi_quad_3p + theme(plot.title = element_blank(), 
                               axis.title.y = element_blank(),
                               ),
          
          labels = c("A", "B", "C", "D"),
          ncol = 2, nrow = 2,
          legend = "none",
          common.legend = TRUE)

Code
# ggsave(paste0("outputs/plots/manuscript_figs_R1/",
#               "NDVI_2x2_CLR_TS_daily_GvM_10rs_", 
#          Sys.Date(), ".png"),
#   width=150, height=120, units="mm", dpi = 1000)

Canopy cover

  • A = 0p model
  • B = 1p model
  • C = 2p model
  • D = 3p model
Code
ggarrange(canopy_quad_0p + theme(plot.title = element_blank(),
                               axis.title.x = element_blank(),
                               axis.text.x = element_blank()), 
          
          canopy_quad_1p + theme(plot.title = element_blank(), 
                               axis.title.x = element_blank(),
                               axis.text.x = element_blank(),
                               axis.title.y = element_blank(),
                               ), 
          
          canopy_quad_2p, 
          
          canopy_quad_3p + theme(plot.title = element_blank(), 
                               axis.title.y = element_blank(),
                               ),
          
          labels = c("A", "B", "C", "D"),
          ncol = 2, nrow = 2,
          legend = "none",
          common.legend = TRUE)
Warning: `stat_contour()`: Zero contours were generated
Warning in min(x): no non-missing arguments to min; returning Inf
Warning in max(x): no non-missing arguments to max; returning -Inf
Warning: `stat_contour()`: Zero contours were generated
Warning in min(x): no non-missing arguments to min; returning Inf
Warning in max(x): no non-missing arguments to max; returning -Inf

Code
# ggsave(paste0("outputs/plots/manuscript_figs_R1/",
#               "canopy_2x2_CLR_TS_daily_GvM_10rs_", 
#          Sys.Date(), ".png"),
#   width=150, height=120, units="mm", dpi = 1000)

Adding all selection surfaces to the same plot

We combine these plots into the figure that appears in the paper, with the NDVI selection surface on the top row and the canopy cover selection surface on the bottom row.
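As an aside, patchwork (loaded in the setup chunk) offers operator-based composition as an alternative to `ggpubr::ggarrange()` for this kind of layout. A minimal sketch using built-in data as stand-ins for the surface plots:

```r
library(ggplot2)
library(patchwork)

# Toy panels standing in for the NDVI (top) and canopy cover (bottom) surfaces
top    <- ggplot(mtcars, aes(hp, mpg)) + geom_point()
bottom <- ggplot(mtcars, aes(hp, wt)) + geom_point()

# `/` stacks panels vertically (cf. ggarrange(..., ncol = 1, nrow = 2));
# `|` places them side by side
stacked <- top / bottom
```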

Code
surface_plots_0p <- ggarrange(ndvi_quad_0p + 
            ggtitle("0p") +
            theme(axis.title.x = element_blank(),
                  axis.text.x = element_blank()), 
          
          canopy_quad_0p +  
            scale_x_continuous("Hour", breaks = c(0,12,24)) +
            theme(plot.title = element_blank()), 
         
          ncol = 1, nrow = 2,
          align = "v",
          legend = "none",
          common.legend = TRUE)
Scale for x is already present.
Adding another scale for x, which will replace the existing scale.
Warning: `stat_contour()`: Zero contours were generated
Warning in min(x): no non-missing arguments to min; returning Inf
Warning in max(x): no non-missing arguments to max; returning -Inf
Warning: `stat_contour()`: Zero contours were generated
Warning in min(x): no non-missing arguments to min; returning Inf
Warning in max(x): no non-missing arguments to max; returning -Inf
Code
surface_plots_0p

Code
surface_plots_1p <- ggarrange(ndvi_quad_1p + 
            ggtitle("1p") +
            theme(axis.title.x = element_blank(),
                  axis.text.x = element_blank(),
                  axis.title.y = element_blank(),
                  axis.text.y = element_blank()), 
          
          canopy_quad_1p + 
            theme(plot.title = element_blank(), 
                  axis.title.y = element_blank(),
                  axis.text.y = element_blank()),
          
          ncol = 1, nrow = 2,
          align = "v",
          legend = "none",
          common.legend = TRUE)

surface_plots_1p

Code
surface_plots_2p <- ggarrange(ndvi_quad_2p + 
            ggtitle("2p") +
            theme(axis.title.x = element_blank(),
                  axis.text.x = element_blank(),
                  axis.title.y = element_blank(),
                  axis.text.y = element_blank()),
          
          canopy_quad_2p + 
            theme(plot.title = element_blank(), 
                  axis.title.y = element_blank(),
                  axis.text.y = element_blank()),
          
          ncol = 1, nrow = 2,
          align = "v",
          legend = "none",
          common.legend = TRUE) 

surface_plots_2p

Code
surface_plots_3p <- ggarrange(ndvi_quad_3p + 
            ggtitle("3p") +
              theme(axis.title.x = element_blank(),
                    axis.text.x = element_blank(),
                    axis.title.y = element_blank(),
                    axis.text.y = element_blank()),
           
           canopy_quad_3p + 
            theme(plot.title = element_blank(), 
                  axis.title.y = element_blank(),
                  axis.text.y = element_blank()),
          
          ncol = 1, nrow = 2,
          align = "v",
          legend = "none",
          common.legend = TRUE)

surface_plots_3p

All selection surfaces

Code
all_selection_surfaces <- ggarrange(surface_plots_0p, surface_plots_1p, 
                                    surface_plots_2p, surface_plots_3p,
          ncol = 4, nrow = 1
          # legend = "none",
          # legend.grob = get_legend(ndvi_quad_2p)
          )

all_selection_surfaces

Code
# ggsave(paste0("outputs/plots/manuscript_figs_R1/",
#               "all_quad_4x1_CLR_TS_daily_GvM_10rs_", 
#          Sys.Date(), ".png"),
#   width=150, height=110, units="mm", dpi = 1000)

References

Fieberg, John, Johannes Signer, Brian Smith, and Tal Avgar. 2021. “A ’How to’ Guide for Interpreting Parameters in Habitat-Selection Analyses.” The Journal of Animal Ecology 90 (5): 1027–43. https://doi.org/10.1111/1365-2656.13441.
Forrest, Scott W, Dan Pagendam, Michael Bode, Christopher Drovandi, Jonathan R Potts, Justin Perry, Eric Vanderduys, and Andrew J Hoskins. 2024. “Predicting Fine‐scale Distributions and Emergent Spatiotemporal Patterns from Temporally Dynamic Step Selection Simulations.” Ecography, December. https://doi.org/10.1111/ecog.07421.

Session info

Code
sessionInfo()
R version 4.4.1 (2024-06-14 ucrt)
Platform: x86_64-w64-mingw32/x64
Running under: Windows 10 x64 (build 19045)

Matrix products: default


locale:
[1] LC_COLLATE=English_Australia.utf8  LC_CTYPE=English_Australia.utf8   
[3] LC_MONETARY=English_Australia.utf8 LC_NUMERIC=C                      
[5] LC_TIME=English_Australia.utf8    

time zone: Australia/Brisbane
tzcode source: internal

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
 [1] scales_1.3.0    patchwork_1.3.0 MASS_7.3-60.2   ggpubr_0.6.0   
 [5] beepr_2.0       tictoc_1.2.1    terra_1.7-78    survival_3.6-4 
 [9] amt_0.2.2.0     lubridate_1.9.3 forcats_1.0.0   stringr_1.5.1  
[13] dplyr_1.1.4     purrr_1.0.2     readr_2.1.5     tidyr_1.3.1    
[17] tibble_3.2.1    ggplot2_3.5.1   tidyverse_2.0.0

loaded via a namespace (and not attached):
 [1] tidyselect_1.2.1   viridisLite_0.4.2  farver_2.1.2       fastmap_1.2.0     
 [5] digest_0.6.37      timechange_0.3.0   lifecycle_1.0.4    sf_1.0-17         
 [9] magrittr_2.0.3     compiler_4.4.1     rlang_1.1.4        tools_4.4.1       
[13] utf8_1.2.4         yaml_2.3.10        knitr_1.48         ggsignif_0.6.4    
[17] labeling_0.4.3     htmlwidgets_1.6.4  bit_4.0.5          classInt_0.4-10   
[21] abind_1.4-8        KernSmooth_2.23-24 withr_3.0.1        grid_4.4.1        
[25] fansi_1.0.6        e1071_1.7-16       colorspace_2.1-1   isoband_0.2.7     
[29] cli_3.6.3          rmarkdown_2.28     crayon_1.5.3       generics_0.1.3    
[33] tzdb_0.4.0         DBI_1.2.3          proxy_0.4-27       audio_0.1-11      
[37] splines_4.4.1      parallel_4.4.1     vctrs_0.6.5        Matrix_1.7-0      
[41] jsonlite_1.8.8     carData_3.0-5      car_3.1-2          hms_1.1.3         
[45] bit64_4.0.5        rstatix_0.7.2      units_0.8-5        glue_1.7.0        
[49] codetools_0.2-20   cowplot_1.1.3      stringi_1.8.4      gtable_0.3.5      
[53] munsell_0.5.1      pillar_1.9.0       htmltools_0.5.8.1  R6_2.5.1          
[57] Rdpack_2.6.1       vroom_1.6.5        evaluate_1.0.0     lattice_0.22-6    
[61] rbibutils_2.2.16   backports_1.5.0    broom_1.0.6        class_7.3-22      
[65] Rcpp_1.0.13        gridExtra_2.3      xfun_0.47          pkgconfig_2.0.3